Securing Your Data: The Ultimate Local LLM Setup Guide
Rajat Singh
Lead Developer @Arrow Electronics. Developing and enhancing SAP E-Commerce (Hybris) applications. Passionate coder/thinker. Let's connect!
I had already integrated ChatGPT into my daily activities, but I was always cautious about sharing private data. I wanted a solution that would allow me to run a large language model (LLM) securely in my local environment so that I could have a ...
Before we go ahead, let's understand what an LLM (Large Language Model) is
Think of an LLM as a very large file of model weights that an inference program (often a Python script) loads and runs, and that requires significant computing power to process prompts effectively.
Another way to visualize LLMs is by comparing them to operating systems. Just as operating systems manage many tasks simultaneously, LLMs need massive parallel processing to perform efficiently. This is where GPUs (Graphics Processing Units) come in: GPUs are designed for parallel processing and are the ideal choice for running LLMs smoothly.
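To see what your own machine offers, here is a minimal sketch (assuming PyTorch is installed; it is purely illustrative, since LM Studio and Ollama detect your hardware automatically):

```python
import torch

# Check for an NVIDIA GPU (CUDA backend)
if torch.cuda.is_available():
    print("CUDA GPU found:", torch.cuda.get_device_name(0))
# Check for Apple Silicon acceleration (Metal / MPS backend)
elif torch.backends.mps.is_available():
    print("Apple Silicon GPU (MPS) is available")
else:
    print("No GPU acceleration detected; models will fall back to the CPU")
```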
If you're curious about the specific hardware needed to run LLMs, there are many detailed guides available online, but you can refer to the table below for a quick overview.
Now, let’s move on to setting up a large language model on your local computer. While there are multiple ways to do this, I’ll focus on the two methods that are the most reliable and well-known.
LM Studio
The setup process of LM Studio is straightforward. Start by downloading the required software from their official website. Once installed, you'll find an interface that allows you to explore various models.
There are two LLM file formats to choose from: GGUF and MLX. GGUF is generally the better option because it is compressed (quantized) and optimized, making it suitable for running on devices with limited hardware resources, while MLX models are tuned for Apple Silicon.
Now the question is: which model should you choose?
You can see rankings of open-source models at https://lmarena.ai/ to get an idea of users' preferences, and then search for the model in LM Studio to download it.
You can also filter these models directly in LM Studio, for example by most liked or most downloaded.
Most of these models are censored (aligned to refuse certain requests), but things get more interesting: you can also download uncensored models.
You can read more about them in this blog post: https://erichartford.com/uncensored-models
Warning: do not use these models for anything unethical.
Once the download is complete, you can go ahead and load the model for chat.
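Beyond the built-in chat window, LM Studio can also run a local server that speaks the OpenAI API format (by default on http://localhost:1234). Below is a minimal sketch, assuming you have started that server, loaded a model, and installed the openai Python package; the model name is a placeholder for whatever identifier LM Studio shows for your loaded model:

```python
from openai import OpenAI

# Point the OpenAI client at LM Studio's local server instead of the cloud.
# No real API key is needed locally; any placeholder string works.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

response = client.chat.completions.create(
    model="your-local-model",  # placeholder: use the identifier shown in LM Studio
    messages=[{"role": "user", "content": "Explain in one sentence what a GGUF file is."}],
)
print(response.choices[0].message.content)
```

Because the request never leaves localhost, your prompts and data stay on your own machine.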
Ollama
Ollama is another open-source project that you can set up on your local machine to run large language models.
You can follow the setup guide on GitHub below to download the installer for your operating system; you can even pull the Docker image to get started with Ollama as a Docker container.
GitHub URL: https://github.com/ollama/ollama
After installing it, you can download and run a model by executing the command below:
ollama run llama3.2
It will start downloading the LLM if it is not already present, and you will be able to see the progress as shown below.
You can download multiple models ...
Here, you can chat directly in the terminal:
>>> write a python script to read a csv file
Here is a simple Python script using the `pandas` library to read a CSV file:
```python
import pandas as pd

def read_csv(file_path):
    try:
        # Read the CSV file
        df = pd.read_csv(file_path)

        # Print the first few rows of the DataFrame
        print(df.head())

        # Return the DataFrame
        return df
    except FileNotFoundError:
        print("The file was not found.")
    except pd.errors.EmptyDataError:
        print("The file is empty.")
    except pd.errors.ParserError as e:
        print(f"An error occurred while parsing the file: {e}")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")

# Usage
file_path = 'example.csv'  # Replace with your CSV file path
df = read_csv(file_path)
```
This script defines a function `read_csv` that takes the path to a CSV file, attempts to read it, and returns the DataFrame. It includes error handling for common exceptions that may occur when reading a CSV file.
Alternatively, you can use the `csv` library without pandas:
```python
import csv

def read_csv(file_path):
    try:
        with open(file_path, 'r') as file:
            reader = csv.reader(file)

            # Get all rows from the CSV file
            data = list(reader)

            # Print the first few rows of the data
            for row in data[:5]:
                print(row)

            return data
    except FileNotFoundError:
        print("The file was not found.")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")

# Usage
file_path = 'example.csv'  # Replace with your CSV file path
data = read_csv(file_path)
```
This script defines a function `read_csv` that takes the path to a CSV file, attempts to open it and read its contents, and returns the data. It includes error handling for common exceptions that may occur when reading a CSV file.
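Chatting in the terminal is handy for quick tests, but Ollama also exposes a local HTTP API (on port 11434 by default), so your own scripts can call the same model without any data leaving your machine. Here is a minimal sketch using Python's requests library, assuming llama3.2 has already been pulled:

```python
import requests

# Ollama's local REST API listens on port 11434 by default.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",
        "prompt": "Write a one-line Python expression that reverses a string.",
        "stream": False,  # return the full answer as a single JSON payload
    },
)
print(response.json()["response"])
```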
Some important commands to note:
# exit the chat session
>>> /bye
# remove a model
ollama rm llama3.2
# copy a model to create your own
ollama cp llama3.2 my-model
# you can list all the available and downloaded models
ollama list
Example output:
NAME               ID              SIZE      MODIFIED
llama3.2:latest    a80c4f17acd5    2.0 GB    14 minutes ago
# list loaded models
ollama ps
# stop a loaded model
ollama stop llama3.2
Thank you for reading.