Is the Future Open-Source? Ollama on Windows 11
Image generated by Author


#1 of 51 awesome use cases of open-source LLMs, by Tushar Aggarwal


In recent times, with the advancement of open-source technologies, tools like Ollama and LM Studio have gained significant popularity among developers and researchers. Ollama, known for its streamlined way of running open-source LLMs locally, and LM Studio, a desktop app for discovering and running local models, have both made open models far more accessible; Ollama has now taken a further step by adding Windows support (currently in preview). This development opens up new avenues for Windows users to harness the power of Ollama efficiently for a variety of tasks.

Contents

1. Local LLMs

2. About Ollama

3. Challenges with Local LLMs Controlled from Ollama

4. Comparison with Cloud-Based Options

5. The Setup

6. Running Ollama for the first time

7. Some experiments with popular models

8. Using Ollama with Python


1. Local LLMs

With the rise of large language models (LLMs) powered by artificial intelligence, such as OpenAI's GPT series, the need for effective testing methodologies becomes increasingly apparent. This article delves into the intricacies of testing local LLMs and explores ways to overcome challenges associated with their quality and speed, particularly when controlled from Ollama.

Running LLMs locally can be beneficial for several reasons:

Development: Quickly iterate locally without needing to deploy model changes.

Privacy and security: Running models locally means your data doesn't leave your machine, which can be crucial if you're working with sensitive information.

Cost: Depending on your usage volume, running models locally can be more cost-effective than making API calls to a cloud service.

Control: You have more control over the model and can tweak it as needed.

2. About Ollama

Ollama is a streamlined tool for running open-source LLMs locally, including Mistral and Llama 2. Ollama bundles model weights, configurations, and datasets into a unified package managed by a Modelfile.

Ollama supports a variety of LLMs, including LLaMA-2, uncensored LLaMA, CodeLLaMA, Falcon, Mistral, Vicuna, WizardCoder, and uncensored Wizard variants.

Ollama also supports the creation and use of custom models. You can create a model from a Modelfile: when you run the create command, Ollama parses the Modelfile, creates the various layers, writes the weights, and finally reports a success message.
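For illustration, a minimal Modelfile might look like the sketch below; the base model, parameter value, and system prompt here are just placeholders, not a recommendation:

# Modelfile: build a small custom assistant on top of llama2
FROM llama2
# sampling temperature; lower values give more deterministic answers
PARAMETER temperature 0.7
# system prompt baked into the custom model
SYSTEM "You are a concise assistant that answers in plain English."

You would then build and run it with ollama create my-assistant -f Modelfile followed by ollama run my-assistant (the name my-assistant is, again, just an example).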

Some of the other models available on Ollama include:

  • Llama2: Meta’s foundational “open source” model.
  • Mistral/Mixtral: Mistral is a 7-billion-parameter foundation model; fine-tunes such as Mistral-OpenOrca build on Mistral 7B using the OpenOrca dataset, and Mixtral is a larger mixture-of-experts model from the same team.
  • Llava: A multimodal model called LLaVA (Large Language and Vision Assistant) which can interpret visual inputs.
  • CodeLlama: A model trained on both code and natural language in English.
  • DeepSeek Coder: Trained from scratch on a corpus of 87% code and 13% natural language in English.
  • Meditron: An open-source medical large language model adapted from Llama 2 to the medical domain.

3. Challenges with Local LLMs Controlled from Ollama

Local LLMs controlled from Ollama provide a convenient means for developers to experiment and iterate on their models without relying on external resources. However, these local setups often face limitations in quality and speed compared to their cloud-based counterparts: constrained computational resources and the lack of optimization inherent in local setups can hinder the overall testing experience. For this blog I am using a system with 16 GB of RAM and an Intel i5 CPU. As a general rule of thumb for memory, if you are using a 7B model you should have at least 8 GB of RAM available (roughly speaking, a 7B model quantized to 4 bits already needs about 4 GB for the weights alone, before the KV cache and the rest of the system).

4. Comparison with Cloud-Based Options

In contrast to local setups, cloud-based options offer several advantages in terms of scalability, performance, and accessibility. By leveraging the vast computational power and infrastructure provided by cloud platforms, developers can conduct testing procedures more efficiently and effectively. Additionally, cloud-based setups facilitate seamless collaboration and integration with other development tools and services. So use Ollama for testing and local experimentation.

5. The Setup

Here comes the good part:

  1. Download Ollama from the official website. (It will automatically select the Windows (preview) version.)

Snapshot by Author

2. After downloading, the installation process is straightforward and similar to other software installations. (Currently you cannot select the destination folder, so make sure you have enough disk space for your models.)

3. Once installed, Ollama runs a local API where it serves the model, allowing users to interact with the model directly from their machine (a quick check is sketched below).
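As a quick sanity check, you can hit that local API from Python. This is a minimal sketch assuming the default port 11434 and, for the second call, that the llama2 model has already been pulled:

import requests

# the root endpoint simply answers "Ollama is running" when the server is up
print(requests.get("http://localhost:11434").text)

# once a model has been pulled, the same server handles generation requests
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama2", "prompt": "Say hello in one sentence.", "stream": False},
)
print(resp.json()["response"])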

6. Running Ollama for the first time

Running models using Ollama is a simple process. Users can download and run models using the run command in the terminal; if the model is not installed, Ollama will automatically download it first. For example, to run the Llama 2 model, you would use the command ollama run llama2.
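For reference, a typical first session looks something like this (the model names are just examples, and the pull step is optional since run fetches missing models automatically):

ollama pull llama2      # download the model weights (only needed once)
ollama run llama2       # start an interactive chat with the model
ollama list             # show the models installed locally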

Here is what it will look like:

Snapshot by Author

When it's ready, it will look something like this:

Snapshot by Author

7. Some experiments with popular models

For obvious reasons, llama2 is the most-pulled model (430.2K pulls as of this writing). Here is how it should look:

Snapshot by Author

Now, switching to PowerShell:

Here is the sample prompt I used to ask about APIs:

Snapshot by Author

No internet connection is needed to produce this output, which is exactly the purpose that local LLMs serve.

Note that output speed will depend on your Windows system and the model used. Please refer to the Ollama website for the RAM requirements of your chosen model.


Now let's use ollama run codellama:7b:

Snapshot by Author

Here is the sample prompt I used with codellama:7b:

"You are an expert programmer that writes simple, concise code and explanations. Write a python function to generate the nth fibonacci number."        
Snapshot by Author
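In case the snapshot is hard to read, the answer is typically along these lines. This is only a sketch of the kind of function the model returns; your exact output will differ from run to run:

def fibonacci(n):
    """Return the nth Fibonacci number (0-indexed, so fibonacci(0) == 0)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fibonacci(10))  # 55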

You can also experiment with Open WebUI:

https://github.com/open-webui/open-webui/

8. Using Ollama with Python

You can also use Ollama with Python. LiteLLM is a Python library that provides a unified interface to interact with various LLMs, including those run by Ollama.

To use Ollama with LiteLLM, you first need to ensure that your Ollama server is running. Then, you can use the litellm.completion function to make requests to the server. Here's an example of how to do this:

from litellm import completion

# send a chat request to the local Ollama server (default port 11434)
response = completion(
    model="ollama/llama2",
    messages=[{"role": "user", "content": "What can you even do with an API?"}],
    api_base="http://localhost:11434",
)

print(response)

In this example, ollama/llama2 is the model being used, and the messages parameter contains the input for the model. The api_base parameter is the address of the Ollama server.

Moreover, LiteLLM’s unified interface allows you to switch between different LLM providers easily, which can be useful if you want to compare the performance of different models or if you have specific models that you prefer for certain tasks.
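As a small sketch of that flexibility, only the model string changes when you swap providers; the hosted call below is purely illustrative and assumes an OPENAI_API_KEY environment variable is set:

from litellm import completion

prompt = [{"role": "user", "content": "Summarise what an API is in one sentence."}]

# local model served by Ollama
local = completion(model="ollama/llama2", messages=prompt, api_base="http://localhost:11434")

# hosted model: the calling code stays the same, only the model name changes
hosted = completion(model="gpt-3.5-turbo", messages=prompt)

print(local.choices[0].message.content)
print(hosted.choices[0].message.content)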

In both cases, api_base is the URL where Ollama is serving the model (by default, http://localhost:11434), and model is the name of the model you want to use (in this case, llama2).

FAQs

1. Can I run Ollama on older versions of Windows?

Ollama is designed to be compatible with Windows 10 and later. However, it's recommended to use the latest supported Windows version for optimal performance and security, and note that the Windows build is currently a preview release.

2. Does running Ollama on Windows require a powerful hardware configuration?

While Ollama can leverage hardware acceleration for enhanced performance, it is designed to run efficiently on a variety of hardware configurations. Users with older or less powerful systems can still benefit from Ollama’s capabilities.

3. Can I fine-tune language models using LM Studio on Windows?

LM Studio also runs on Windows and can be used alongside Ollama for working with local models; however, fine-tuning language models for specific tasks or domains typically relies on separate tooling rather than on Ollama itself.

4. Is Ollama on Windows free to use?

Yes, Ollama is available as a free download for Windows users.

5. Are there any limitations to running Ollama on Windows compared to other platforms?

Ollama on Windows offers the same core functionality and capabilities as on other platforms. However, users may encounter minor differences in performance or compatibility based on their specific hardware and software configurations.

Conclusion

In conclusion, while testing local LLMs controlled from Ollama provides a convenient solution for development and experimentation, developers must be cognizant of the associated challenges in terms of quality and speed. By leveraging cloud-based options where scale and performance matter, developers can enhance the efficiency and reliability of their testing procedures, ultimately improving the overall performance of their language models.


Upcoming: a local-LLM code assistant for VS Code with Ollama, and more use cases.

Newsletter DataUnboxed

GitHub, LinkedIn, X, Hashnode

Since 2022 I have developed 300+ production-ready applications using various tools for clients' MVPs, guides, and more, and I will be sharing guides and community apps on various platforms… (Looking for business or affiliate collaborations? Contact me!)


