TOTW 23: Ask Eve AI – Crafting the Future of Information Access
Ask Eve AI - A hybrid of AI and classic processing

It’s time to dive back into the world of AI with some exciting updates. As mentioned in TOTW 20, I’ve been busy exploring new directions and have some thrilling insights to share about my journey into the technical territory of Ask Eve AI, affectionately known as Evie.

Choosing the Language: Python

The first step was to choose the programming language for developing Evie. Python emerged as the clear choice for several reasons:

  • Industry Standard for AI: Python is the go-to language for data and AI-based solutions.
  • Hands-on Experience: I had dabbled in various open-source generative AI projects like ComfyUI, OpenInterpreter, ChatDev, and InvokeAI, all predominantly using Python.
  • Collaborative Edge: Conversations with former colleagues about Langchain provided an extra push.
  • Meta-Programming Background: My experience with tools like AionDS, ObjectPro, and Ruby made Python a natural fit.

Python offers several advantages:

  • Rich Ecosystem: A plethora of frameworks and libraries, providing immense freedom and flexibility. This might be overwhelming for some, but I enjoy having so many options at my fingertips.
  • Abundant Resources: Extensive online knowledge base for easy problem-solving. It's as simple as searching and finding solutions.
  • LLM Compatibility: Proficiency with large language models, as highlighted by Mistral AI:

(Figure: performance accuracy on code generation benchmarks, via Mistral.ai)

Assembling the Technical Components

With Python chosen, the next step was selecting the technical components. Drawing from my extensive experience with open-source systems, I opted for PostgreSQL with the pgVector extension for a hybrid relational and vector database, nginx for routing and serving static files, and Redis as a message broker and for caching. These choices have proven solid.
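To give an idea of what that hybrid relational-plus-vector database looks like in practice, here is a minimal sketch using the psycopg2 driver. The table, column names, and connection details are made up for illustration and don't reflect Evie's actual schema.

```python
# Minimal pgVector sketch; table, columns and connection details are made up.
import psycopg2

conn = psycopg2.connect("dbname=evie user=evie password=secret host=localhost")
cur = conn.cursor()

# Enable the extension and create a hybrid relational + vector table.
cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
cur.execute("""
    CREATE TABLE IF NOT EXISTS document_chunks (
        id        serial PRIMARY KEY,
        tenant_id integer NOT NULL,
        content   text    NOT NULL,
        embedding vector(1536)  -- dimension depends on the embedding model
    );
""")
conn.commit()

# Nearest-neighbour search: '<->' is pgvector's distance operator.
query_embedding = [0.0] * 1536  # normally produced by an embedding model
vector_literal = "[" + ",".join(str(x) for x in query_embedding) + "]"
cur.execute(
    "SELECT content FROM document_chunks ORDER BY embedding <-> %s::vector LIMIT 5;",
    (vector_literal,),
)
print(cur.fetchall())
```

The appeal of this setup is that regular relational data (tenants, users, documents) and the vector index live in one database, so there is one system to back up, migrate, and host.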

For Python components, I selected:

  • Flask: As the application framework, offering flexibility and numerous extensions, such as authentication, mail, and CORS. I even created some extensions myself, e.g., to support MinIO throughout the application.
  • Celery: For asynchronous task handling and scalability.

I didn't choose all the components upfront, though. The journey itself helped shape the final stack.
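As a rough sketch of how Flask and Celery fit together with Redis in between, consider something like the following. The route, task, and broker URLs are illustrative stand-ins, not Evie's actual code.

```python
# Minimal Flask + Celery sketch; broker/backend URLs and the task are illustrative.
from celery import Celery
from flask import Flask, jsonify

app = Flask(__name__)
celery = Celery(__name__,
                broker="redis://localhost:6379/0",    # Redis as message broker
                backend="redis://localhost:6379/1")   # and as result cache

@celery.task
def ingest_document(document_id):
    # Placeholder for slow work such as parsing and embedding a document.
    return f"ingested {document_id}"

@app.route("/ingest/<int:document_id>", methods=["POST"])
def ingest(document_id):
    # Hand the heavy lifting to a worker and return immediately.
    result = ingest_document.delay(document_id)
    return jsonify({"task_id": result.id}), 202

if __name__ == "__main__":
    app.run(debug=True)
```

The web request stays fast, and scaling simply means running more Celery workers against the same Redis broker.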

Implementing the RAG System

EveAI implements a Retrieval-Augmented Generation (RAG) system to enhance customer experience with natural language chat, particularly tailored for SMEs. This system integrates two crucial frameworks:

  • Langchain: For the necessary LLM functionality.
  • Flask: For the administration app, managing users, documents, and interactions.
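To make the retrieval-augmented flow concrete, here is a minimal sketch using Langchain with a pgvector-backed store. It assumes the langchain-openai and langchain-community packages, an OPENAI_API_KEY in the environment, and made-up connection string and collection names; Evie's real pipeline is considerably richer.

```python
# Minimal Langchain RAG sketch (illustrative only; connection string, model names
# and collection name are assumptions, not Evie's real configuration).
from langchain_openai import OpenAIEmbeddings, ChatOpenAI
from langchain_community.vectorstores import PGVector

embeddings = OpenAIEmbeddings()  # assumes OPENAI_API_KEY is set
store = PGVector(
    connection_string="postgresql+psycopg2://evie:secret@localhost/evie",
    embedding_function=embeddings,
    collection_name="tenant_demo_docs",
)
store.add_texts(["Evie answers customer questions from your own documents."])

# Retrieve the most relevant chunks and let the chat model answer with that context.
question = "What does Evie do?"
chunks = store.similarity_search(question, k=4)
context = "\n\n".join(doc.page_content for doc in chunks)

llm = ChatOpenAI(model="gpt-4o-mini")
answer = llm.invoke(
    f"Answer the question using only this context:\n{context}\n\nQuestion: {question}"
)
print(answer.content)
```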

The Flask implementation isn't standard. For example, it incorporates multi-tenancy: PostgreSQL serves as the database backend with a separate schema per tenant, defined and managed using Flask-SQLAlchemy and Flask-Migrate. This ensures a scalable and flexible architecture capable of handling diverse client requirements.
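One common way to implement schema-per-tenant with SQLAlchemy is the schema_translate_map execution option. The sketch below illustrates the idea with plain SQLAlchemy; the tenant name and model are hypothetical, and Evie's Flask-SQLAlchemy and Flask-Migrate wiring is more involved than this.

```python
# Schema-per-tenant sketch using SQLAlchemy's schema_translate_map
# (tenant name and model are hypothetical illustrations).
from sqlalchemy import Column, Integer, String, create_engine, text
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Document(Base):
    __tablename__ = "documents"
    __table_args__ = {"schema": "tenant"}  # placeholder schema, remapped per tenant
    id = Column(Integer, primary_key=True)
    title = Column(String, nullable=False)

engine = create_engine("postgresql+psycopg2://evie:secret@localhost/evie")

def create_tenant(schema: str) -> None:
    """Create the tenant's schema and its tables by remapping the placeholder."""
    with engine.begin() as conn:
        conn = conn.execution_options(schema_translate_map={"tenant": schema})
        conn.execute(text(f'CREATE SCHEMA IF NOT EXISTS "{schema}"'))  # trusted name
        Base.metadata.create_all(conn)

def session_for(schema: str) -> Session:
    """Open a session whose queries target the given tenant's schema."""
    tenant_engine = engine.execution_options(schema_translate_map={"tenant": schema})
    return Session(bind=tenant_engine)

create_tenant("acme")
with session_for("acme") as session:
    session.add(Document(title="Welcome guide"))
    session.commit()
```

All tenants share one model definition and one migration history, while their data stays cleanly separated at the schema level.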

Choosing AI Systems

The fast-paced advancements in AI made it crucial to keep this part highly configurable—on the embedding side (indexing content), the algorithm side, and the chat side. I’ll come back to this later. This flexibility has already paid off. While developing Evie, many frontier models have emerged, like OpenAI’s GPT-4o and GPT-4o mini, Anthropic’s Claude 3.5 Sonnet, and Mistral’s Mistral Large 2, all of which are being integrated into Evie. There’s also a new open source frontier model, Llama 3.1 (405B, 70B, and 8B), which I’m eager to try out, as these may offer additional advantages, such as fine-tuning the models using LoRAs—something I'm familiar with from using similar technology on Stable Diffusion models.
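To show what "highly configurable" can look like in practice, here is a small sketch of a provider registry driven by configuration. The class, provider keys, and model names are purely illustrative stand-ins, not Evie's actual code.

```python
# Hypothetical provider registry: swap embedding/chat models via configuration.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ModelConfig:
    embedding_model: str   # e.g. "text-embedding-3-small"
    chat_model: str        # e.g. "gpt-4o-mini", "claude-3-5-sonnet", "mistral-large"

# Each factory knows how to build a client for one provider.
CHAT_FACTORIES: Dict[str, Callable[[str], object]] = {
    "openai":    lambda name: f"<OpenAI chat client for {name}>",     # stand-in
    "anthropic": lambda name: f"<Anthropic chat client for {name}>",  # stand-in
    "mistral":   lambda name: f"<Mistral chat client for {name}>",    # stand-in
}

def build_chat_client(provider: str, config: ModelConfig):
    """Look up the provider at runtime so a new frontier model is just a config change."""
    try:
        return CHAT_FACTORIES[provider](config.chat_model)
    except KeyError:
        raise ValueError(f"Unknown provider: {provider}")

client = build_chat_client("openai", ModelConfig("text-embedding-3-small", "gpt-4o-mini"))
print(client)
```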

Supporting European Tech

Supporting European initiatives is vital to reduce dependency on overseas technology, and actually using European technology is essential for that ecosystem to blossom.

On the AI side, while I initially used Mistral, its early versions lacked sufficient proficiency in Dutch (I live in Belgium) and tool support for building Evie. However, the latest release has significantly improved, particularly in supporting Dutch and function calling. So I’ll be integrating Mistral AI pretty soon.

I also chose Stackhero as my hosting provider for these reasons:

  • European Compliance: Adherence to regulations.
  • Minimal Lock-in: Utilisation of open-source tools.
  • Seamless Transition: Ease of transferring between development and production environments.

Learning from Mistakes

Mistakes were made, but they provided valuable learning experiences. I hope they're useful to you too:

  • Homebrew Hiccups / Python Version Dependency: A Homebrew upgrade broke a Python library and made the application unusable. I went for a more robust solution by containerising the application, which took quite some time and effort while adding no real functionality.
  • File Handling Challenges: When creating an app for local usage, relying on the local file system is fine. For a SaaS solution, however, this is no longer viable, so I integrated MinIO to build a large, cloud-ready repository of information (see the sketch after this list).
  • PDF Handling: A commercial library fell short, and smart prompting with LLMs proved way more effective (wonderful, isn’t it?).
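For the MinIO integration, the official Python client keeps things simple. Here is a minimal sketch, with made-up endpoint, credentials, and bucket name:

```python
# Minimal MinIO sketch (endpoint, credentials and bucket name are made up).
from minio import Minio

client = Minio(
    "localhost:9000",
    access_key="evie",
    secret_key="secret",
    secure=False,  # local development; use TLS in production
)

bucket = "tenant-demo-documents"
if not client.bucket_exists(bucket):
    client.make_bucket(bucket)

# Upload a source document and stream it back later instead of touching local disk.
client.fput_object(bucket, "manuals/getting-started.pdf", "getting-started.pdf")
response = client.get_object(bucket, "manuals/getting-started.pdf")
data = response.read()
response.close()
response.release_conn()
print(f"retrieved {len(data)} bytes")
```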

Ask Eve AI: Leveraging AI Technology

Evie heavily utilises the latest AI technology. The most obvious areas are:

  • Embedding Models: For understanding and indexing content.
  • AI-Driven Interactions: Facilitating user chats.

Behind the scenes, however, Evie employs about 10 additional prompts for various tasks, such as processing PDF files. This set of prompts will continue to expand as Evie evolves. For that reason, I see Evie as a hybrid of classical software engineering and prompt engineering.
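To illustrate what such a behind-the-scenes prompt can look like, here is a rough sketch of an instruction that turns raw PDF text into clean content. The wording is an invented example, not Evie's actual prompt.

```python
# Illustrative prompt-engineering sketch; the wording is not Evie's actual prompt.
PDF_CLEANUP_PROMPT = """You are a document-processing assistant.
You receive raw text extracted from a PDF page. It may contain broken line
endings, repeated headers/footers, and page numbers.

Return the content as clean Markdown:
- merge broken sentences back together,
- drop headers, footers and page numbers,
- keep headings, lists and tables.

Raw page text:
{page_text}
"""

def build_pdf_cleanup_messages(page_text):
    """Build a chat-completion style message list for the clean-up step."""
    return [{"role": "user", "content": PDF_CLEANUP_PROMPT.format(page_text=page_text)}]

messages = build_pdf_cleanup_messages("ACME Corp    page 3\nOur prod-\nuct line includes ...")
print(messages[0]["content"][:120])
```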

Creating Evie also involved extensive use of AI. LLMs like GPT-4 and Claude 3.5 Sonnet have been indispensable in developing Evie, offering expertise in Python, Docker, SQL, bug-fixing, and idea generation. They truly are the dream team members every developer wishes for.

This trend of smaller companies creating great products will only grow stronger as LLMs become more agentic, driving rapid innovation and growth. But that may be a topic for a future TOTW.

