5 Essential Free Tools for Getting Started with LLMs
Kshitij Sharma
IEEE Member | CSI Member | AI & ML Engineer | Generative AI, LLMs, NLP, RAG, Computer Vision | Researcher & Developer | Conference Presenter | Open-Source Contributor | Building Intelligent Systems for Healthcare
Introduction
Large language models (LLMs) are now widely used and helpful for a variety of tasks, but the sheer volume of LLM tools and utilities can be daunting to newcomers. To help beginners navigate the vast array of accessible language models, this article concentrates on five of the most popular and widely available tools: Transformers, LlamaIndex, LangChain, Ollama, and Llamafile. All five are free of charge.
1. Transformers
Transformers, from NLP powerhouse Hugging Face, is one of the best-known libraries for working with contemporary NLP models. It offers a wide range of pre-trained models, both base and fine-tuned, for tasks such as question answering, translation, text classification, and more.
Key Features
Transformers is a good fit for new users: the basics are simple to pick up, yet the library scales to even the most complex tasks. It ships with extensive documentation, user-friendly APIs, and an enormous collection of available models. With Transformers, beginners can start using state-of-the-art models without deep expertise in deep learning.
Getting Started
First, install Transformers:
pip install transformers
Example: Loading a pre-trained model and running inference
from transformers import pipeline
# With no model specified, pipeline() downloads a default checkpoint for the task.
classifier = pipeline("sentiment-analysis")
result = classifier("I love using the Transformers library!")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.999...}]
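The pipeline above falls back to a default checkpoint, which can change between library releases. For reproducible results you can pin an explicit model. A short sketch, assuming the checkpoint below (the long-standing default for this pipeline on the Hugging Face Hub) is available:

```python
from transformers import pipeline

# Pinning an explicit checkpoint keeps results stable across Transformers
# releases; with no model argument, the default may silently change.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
result = classifier("Pinned models make results reproducible.")
print(result)
```

The same `model=` argument works for any task pipeline, so you can swap in any compatible checkpoint from the Hub.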
2. LlamaIndex
LlamaIndex is a data framework built specifically for LLM use cases, particularly retrieval augmented generation (RAG). It streamlines the connections between LLMs and external data sources, making it easier to create complex data-backed LLM applications.
Key Features
LlamaIndex is beneficial for novices because it streamlines initial setup and handles the plumbing needed to connect data to the application, allowing simple integration with data sources and customization as needed. Its thorough documentation makes it easy for developers to get started and steer their applications in the right direction.
Getting Started
First, install the library:
pip install llama-index
Example: Building a very simple RAG application (adapted from the LlamaIndex starter tutorial)
Note that for this example your OpenAI API key must be set as the OPENAI_API_KEY environment variable, and that LlamaIndex uses OpenAI’s gpt-3.5-turbo model by default. I am also using a copy of the Machine Learning Mastery ebook “Maximizing Productivity with ChatGPT” as my sole RAG document, located in the “data” directory.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("Tell me a little about prompt engineering.")
print(response)
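Under the hood, a query engine like this follows the retrieve-then-generate pattern: find the chunks most relevant to the question, then stuff them into the model prompt. A stdlib-only sketch of that pattern, with toy keyword scoring standing in for real vector retrieval and all names purely illustrative:

```python
# Toy sketch of the retrieve-then-generate pattern that a RAG query engine
# automates. Real systems rank by embedding similarity, not keyword overlap.
def retrieve(query, documents, k=1):
    """Rank documents by naive keyword overlap with the query."""
    query_words = set(query.lower().split())
    def score(doc):
        return len(query_words & set(doc.lower().split()))
    return sorted(documents, key=score, reverse=True)[:k]

def build_prompt(query, context_chunks):
    """Assemble the augmented prompt that would be sent to the LLM."""
    context = "\n".join(context_chunks)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Prompt engineering is the craft of writing effective LLM inputs.",
    "Qubits are the basic unit of quantum information.",
]
question = "What is prompt engineering?"
print(build_prompt(question, retrieve(question, docs)))
```

LlamaIndex's value is doing exactly this plumbing (chunking, embedding, indexing, prompt assembly) for you behind `as_query_engine()`.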
3. LangChain
LangChain is a framework that allows AI engineers to connect language models with a vast array of data sources, as well as with other LLMs. LangChain also provides pathways for context-aware reasoning applications, offering tools for building capable agents that can perform complex reasoning tasks for problem solving.
Key Features
Beginners can use LangChain to quickly build intelligent agents: it makes application development straightforward and comes with a robust set of tools and templates to get things moving.
Getting Started
Install LangChain via pip:
pip install langchain
Example: Check out the LangChain quickstart guide for a useful introductory tutorial.
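LangChain's core idea, composing a prompt template, a model, and an output parser into a single runnable chain, can be sketched with plain Python function composition. Everything below is illustrative stdlib code, not LangChain's actual API:

```python
# Stdlib-only sketch of LangChain-style chaining: each stage is a callable,
# and a "chain" simply pipes one stage's output into the next, mirroring
# the spirit of LangChain's `prompt | llm | parser` composition.
def prompt_template(topic):
    return f"Explain {topic} in one sentence."

def fake_llm(prompt):
    # Stand-in for a real model call.
    return f"  LLM ANSWER to: {prompt}  "

def output_parser(text):
    return text.strip()

def chain(*stages):
    """Compose stages left to right into a single callable."""
    def run(value):
        for stage in stages:
            value = stage(value)
        return value
    return run

run_chain = chain(prompt_template, fake_llm, output_parser)
print(run_chain("qubits"))
```

The real framework adds what this sketch omits: streaming, retries, tool calling, and swappable model backends behind a uniform interface.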
4. Ollama
Ollama makes multiple LLMs, including Llama 3, Mistral, Gemma, and others, easily accessible locally, reducing the overhead of deploying and managing them. With Ollama you can swiftly set up local LLMs for development and experimentation.
Key Features
Ollama is good for beginners since it brings together a slew of leading large language models and makes them easy to deploy and run locally. Pull Llama 3 locally, for example, then connect to the same model via Ollama from your favorite LLM development framework (LangChain, LlamaIndex, etc.) for development. It really solves multiple problems at once.
Getting Started
Install Ollama from its website for your platform, then install the Python client library to interact with it:
pip install ollama
Example: Use a model in your own Python application
import ollama
# The model must be pulled first: `ollama pull gemma:2b`
response = ollama.generate(model='gemma:2b', prompt='what is a qubit?')
print(response['response'])
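Besides the Python client, Ollama exposes a local REST API (by default on port 11434), which is what frameworks like LangChain and LlamaIndex talk to. The sketch below builds a request for its /api/generate endpoint; the network call itself is commented out so the snippet runs without a server:

```python
import json
from urllib import request

# Build a request for Ollama's local REST API. "stream": False asks for a
# single JSON response object rather than a token-by-token stream.
payload = json.dumps({
    "model": "gemma:2b",
    "prompt": "what is a qubit?",
    "stream": False,
}).encode("utf-8")

req = request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
# Requires a running Ollama server with the model pulled:
# with request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
print(json.loads(payload.decode("utf-8"))["model"])
```

Because it is plain HTTP, any language or tool that can POST JSON can drive a local Ollama model the same way.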
5. Llamafile
Llamafile was created to make it simple to run and share LLMs using a single file. It packages a model's weights and the runtime needed to execute it into one self-contained executable, making models straightforward to distribute and run.
Key Features
Because everything needed to run a model lives in one cross-platform file, there is no installation or environment setup to manage. This minimal complexity makes Llamafile especially approachable for newcomers.
Getting Started
Download a llamafile for your chosen model (for example, the Mistral 7B Instruct llamafile distributed on Hugging Face), then mark it executable:
chmod +x mistral-7b-instruct-v0.2.Q5_K_M.llamafile
Example: Load and query the Mistral llamafile from the command line
./mistral-7b-instruct-v0.2.Q5_K_M.llamafile --temp 0.3 -p '[INST]what is a qubit?[/INST]'
Summary
To help new users get started with LLMs, we have covered five utilities in this post: Transformers, LlamaIndex, LangChain, Ollama, and Llamafile. Each one provides a distinct collection of capabilities, benefits, and features designed to help novices understand and engage with the nuances of the LLM development world. These resources offer an excellent starting point for learning about LLMs.
Be sure to visit and reference each project’s repository and documentation to guide you as you learn and experiment with these tools. Enjoy the process!