Use AI's potential - Run an LLM, locally!
Dominik Pollok
Studying Computer Science, Developing Personal Projects and Working Student at dermanostic
ChatGPT, Gemini, and other AI platforms fall short of self-hosted LLMs in two significant respects: data security and fine-tuneability.
In this article, I will discuss these two key differences and provide ideas on how you can use your own large language model within your company.
Security
The use of ChatGPT and Gemini is already restricted in some companies due to privacy concerns regarding internal company information. This restriction is understandable, as both OpenAI and Google analyze chat histories to further train and improve artificial intelligence algorithms.
For instance, Samsung banned the use of generative AI tools by their employees after an employee uploaded sensitive code to ChatGPT.
Because these AIs are continuously trained on conversations, there is always a risk that data entered into them will later be unintentionally reproduced by the model.
Additionally, data leaks can never be ruled out completely, so highly sensitive information should not be shared with these AIs.
Furthermore, it cannot be excluded that personal data ends up being transmitted to the AI provider, which may violate compliance requirements.
How can AI be used without taking a security risk?
The security risks associated with AI usage diminish significantly when an LLM is executed locally and properly configured. By running an AI on your own servers, you have complete control over security and can even run it offline!
This means that you can set up the network infrastructure in such a way that the AI server is only available in your own network without a connection to the Internet. As with other services, employees can then dial into the company network via VPN and access the AI.
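To make this more concrete, here is a minimal sketch of running an open-source model completely offline with the llama-cpp-python library. The model path, thread count, and prompt are placeholders for illustration; any quantized GGUF model downloaded to your server will do.

```python
# Minimal offline inference sketch using llama-cpp-python (pip install llama-cpp-python).
# Assumes a quantized GGUF model file has already been downloaded to the server;
# the path below is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,   # context window
    n_threads=8,  # tune to your CPU
)

# No network access is needed at this point: prompt in, completion out.
result = llm(
    "Q: Summarize our internal VPN policy in one sentence. A:",
    max_tokens=128,
    stop=["Q:"],
)
print(result["choices"][0]["text"].strip())
```

Because everything runs on your own machine, the same setup works on a server that has no route to the Internet at all.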
Fine-tuneability
ChatGPT is trained by OpenAI, Gemini is trained by Google - why not train your own?
One of the biggest disadvantages of services like ChatGPT and Gemini is that you can't modify them. When it comes to integrating AI into work life, there are many more possibilities than these services can provide. From my point of view, the true potential of artificial intelligence is only revealed when it is tailored to individual needs and purposes.
Here's an example:
You want to integrate a chatbot on your website to answer customers' questions about your products. Chatbots used to rely on rule-based approaches and hard-coded responses, but today some companies already use AI-based chatbots that have been trained on their own data. You can provide your self-hosted AI with all the documents used for customer communication and thus create a chatbot that answers very specific questions about your product or service.
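To illustrate the idea (this is not privateGPT's actual pipeline, just a hypothetical sketch), the snippet below embeds a few product snippets with the sentence-transformers library, retrieves the most relevant one for a customer question, and builds a prompt for a locally hosted LLM. The example snippets and the build_prompt helper are made up for illustration.

```python
# Hypothetical retrieval-augmented chatbot sketch (not privateGPT's internals).
# Requires: pip install sentence-transformers numpy
import numpy as np
from sentence_transformers import SentenceTransformer

# Your customer-facing documents, split into short snippets beforehand.
snippets = [
    "Our premium plan includes 24/7 support and a 99.9% uptime guarantee.",
    "Refunds are possible within 30 days of purchase.",
    "The API allows up to 1,000 requests per minute on the standard tier.",
]

embedder = SentenceTransformer("all-MiniLM-L6-v2")  # small local embedding model
snippet_vectors = embedder.encode(snippets, normalize_embeddings=True)

def build_prompt(question: str) -> str:
    """Find the most similar snippet and place it into the LLM prompt."""
    q_vec = embedder.encode([question], normalize_embeddings=True)[0]
    best = snippets[int(np.argmax(snippet_vectors @ q_vec))]
    return (
        "Answer the customer's question using only the context below.\n"
        f"Context: {best}\n"
        f"Question: {question}\n"
        "Answer:"
    )

prompt = build_prompt("Can I get my money back after two weeks?")
# `prompt` would now be passed to the locally hosted LLM, for example the
# llama-cpp setup shown earlier: llm(prompt, max_tokens=128)
print(prompt)
```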
The performance of a self-hosted AI can be comparable to services like ChatGPT, thanks to some outstanding open-source LLMs such as Llama 2, which Meta trained with up to 70B parameters. The availability of pre-trained models also means that enormous computing power is no longer required to make use of LLMs.
However, it's important to note that the performance of the AI and its ability to be fine-tuned still depend on the hardware's capabilities.
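As a rough rule of thumb, the model weights alone need about (number of parameters × bytes per parameter) of memory, plus overhead for the runtime and the KV cache. The small calculation below is only a back-of-the-envelope estimate and ignores that overhead.

```python
# Back-of-the-envelope memory estimate for model weights only
# (ignores KV cache, activations and runtime overhead).
def weight_memory_gb(params_billions: float, bits_per_param: int) -> float:
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1e9  # gigabytes

for bits in (16, 8, 4):
    print(f"Llama 2 70B at {bits}-bit: ~{weight_memory_gb(70, bits):.0f} GB")
# Roughly 140 GB at 16-bit, 70 GB at 8-bit and 35 GB at 4-bit;
# this is why quantized models are popular for self-hosting.
```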
PrivateGPT
PrivateGPT is a production-ready AI project that allows you to ask questions about your documents using the power of Large Language Models (LLMs), even in scenarios without an Internet connection. 100% private, no data leaves your execution environment at any point. ― privateGPT GitHub page
What is particularly interesting about this open-source project is its API, through which you can access the AI. This allows the LLM to be integrated into other services and scenarios as well, and it is free of charge even for commercial use!
I enjoy using privateGPT with Llama 2.
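Here is a quick sketch of how such an API can be called from another service. Recent privateGPT versions expose an OpenAI-style HTTP API; the host, port, and the use_context field below are assumptions based on its documentation, so verify them against the version you deploy.

```python
# Sketch of calling a locally hosted privateGPT instance over HTTP.
# Host/port and the privateGPT-specific "use_context" field are assumptions;
# check them against the API docs of the version you run.
import requests

response = requests.post(
    "http://localhost:8001/v1/chat/completions",  # assumed local address, adjust as needed
    json={
        "messages": [
            {"role": "user", "content": "What does our refund policy say?"}
        ],
        "use_context": True,  # ask privateGPT to ground the answer in ingested documents
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

Because the schema follows OpenAI's API, existing client code can often be pointed at the local server with little more than a changed base URL.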
Use cases for your company
What do you think?
Do you already have a self-hosted AI in place?