Geek Out Time: Creating a Local AI Agent on My Mac Using Autogen Builder with the Local LLM (Microsoft Phi3 model)
(Also on Constellar tech blog: https://medium.com/the-constellar-digital-technology-blog/geek-out-time-creating-a-local-ai-agent-on-my-mac-using-autogen-builder-with-the-local-llm-08862e908443 )
AI has come a long way, and creating your own AI agent is now more accessible than ever thanks to the newly released agent frameworks from Microsoft, Google, and OpenAI. With tools like AutoGen Builder (https://microsoft.github.io/autogen/) and capable local LLMs like Microsoft Phi3, you can build and run AI solutions right on your Mac. Let's do it together, step by step.
Step 1: Install AutoGen Builder
In my previous post, we played with AutoGen from Microsoft. AutoGen Builder is its web portal, a no-code environment for creating workflows, models, and agents. Run the following command to install it:
pip install autogenstudio
Once installed, run its web UI in the terminal:
autogenstudio ui --port 8082
It has quite a clean UI.
Step 2: Run the Local LLM with Ollama
Running a local LLM with Ollama is straightforward; use the command below:
ollama run phi3
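Once `ollama run phi3` works in the terminal, Ollama also exposes a local REST API (on port 11434 by default) that you can use to sanity-check the model outside the interactive prompt. The sketch below is a minimal example in Python using only the standard library; the endpoint and payload follow Ollama's `/api/generate` format, so adjust if your setup differs.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_request(model: str, prompt: str) -> dict:
    """Build a non-streaming generate request body for Ollama's REST API."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_phi3(prompt: str) -> str:
    """POST the prompt to the local Ollama server and return the reply text."""
    payload = json.dumps(build_request("phi3", prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# ask_phi3("Say hello in one sentence.")  # requires the Ollama server to be running
```

If the call returns a reply, the model is up and you can move on to wiring it into AutoGen.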
Well, I was overconfident with my M1 MacBook at first. I tried to run Gemma 2 9B from Google and later found my Mac hanging during workflow testing. Therefore, I had to switch to Phi 3 Mini, which is much smaller at 2.3 GB.
Step 3: Run LiteLLM and Gunicorn
AutoGen natively supports LLMs from OpenAI and Gemini, but not Phi3, so we use LiteLLM with Gunicorn as a proxy that exposes an OpenAI-compatible API for AutoGen to call. The installation is straightforward:
pip install litellm --upgrade
I ran into the following error:
$ litellm --model ollama/gemma2
Traceback (most recent call last):
File "/Users/nedved/.pyenv/versions/3.9.16/lib/python3.9/site-packages/litellm/proxy/proxy_server.py", line 53, in <module>
import backoff
ModuleNotFoundError: No module named 'backoff'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Users/nedved/.pyenv/versions/3.9.16/bin/litellm", line 8, in <module>
sys.exit(run_server())
File "/Users/nedved/.pyenv/versions/3.9.16/lib/python3.9/site-packages/click/core.py", line 1157, in __call__
return self.main(*args, **kwargs)
File "/Users/nedved/.pyenv/versions/3.9.16/lib/python3.9/site-packages/click/core.py", line 1078, in main
rv = self.invoke(ctx)
File "/Users/nedved/.pyenv/versions/3.9.16/lib/python3.9/site-packages/click/core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/Users/nedved/.pyenv/versions/3.9.16/lib/python3.9/site-packages/click/core.py", line 783, in invoke
return __callback(*args, **kwargs)
File "/Users/nedved/.pyenv/versions/3.9.16/lib/python3.9/site-packages/litellm/proxy/proxy_cli.py", line 262, in run_server
raise e
File "/Users/nedved/.pyenv/versions/3.9.16/lib/python3.9/site-packages/litellm/proxy/proxy_cli.py", line 248, in run_server
from .proxy_server import (
File "/Users/nedved/.pyenv/versions/3.9.16/lib/python3.9/site-packages/litellm/proxy/proxy_server.py", line 59, in <module>
raise ImportError(f"Missing dependency {e}. Run pip install 'litellm[proxy]'")
ImportError: Missing dependency No module named 'backoff'. Run pip install 'litellm[proxy]'
So I had to install the additional dependencies needed to run the proxy server, including backoff:
pip install 'litellm[proxy]'
Then, run LiteLLM with the Phi3 model:
litellm --model ollama/phi3
You will see the proxy running on http://0.0.0.0:4000.
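With the proxy up on port 4000, any OpenAI-style client can now talk to Phi3. Here is a minimal sketch using only the Python standard library; the `/chat/completions` path and message format follow the OpenAI API shape that LiteLLM emulates, and the authorization value is arbitrary since the local proxy does not check it.

```python
import json
import urllib.request

PROXY_URL = "http://0.0.0.0:4000/chat/completions"  # LiteLLM's OpenAI-compatible endpoint


def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }


def chat(prompt: str) -> str:
    """Send a chat request through the LiteLLM proxy and return the reply text."""
    body = json.dumps(build_chat_request("ollama/phi3", prompt)).encode("utf-8")
    req = urllib.request.Request(
        PROXY_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer NotRequired",  # any value works for the local proxy
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]


# chat("What is AutoGen?")  # requires the LiteLLM proxy to be running
```

This is exactly the kind of request AutoGen Builder will send on your behalf once it is pointed at the proxy.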
Step 4: Configure AutoGen Builder
First, we need to create the test workflow.
The workflow will use the agent “local_assistant”.
Then, we will configure the “local_assistant” agent to use our local LLM Phi3 (local_gemma2 — sorry, I forgot to update the name to local_phi3).
Next, we need to create the model. For the “API key,” I put “NotRequired” as I am calling the local LLM Phi3. Click “Test Model,” and you will see “Model tested successfully” if everything is good.
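The same settings entered in the Builder UI can also be expressed as an AutoGen-style config list if you later script the agent instead of using the portal. The field names below (`model`, `base_url`, `api_key`) follow the OpenAI-client conventions that AutoGen uses; treat this as an illustrative sketch rather than the exact schema of your AutoGen version.

```python
# Illustrative AutoGen-style model configuration mirroring the Builder UI fields.
config_list = [
    {
        "model": "phi3",                    # the model served by Ollama via LiteLLM
        "base_url": "http://0.0.0.0:4000",  # the LiteLLM proxy endpoint
        "api_key": "NotRequired",           # the local proxy ignores the key, but the field must be set
    }
]

llm_config = {"config_list": config_list, "temperature": 0.7}
```

Passing `llm_config` to an AutoGen agent is the scripted equivalent of clicking "Test Model" and seeing "Model tested successfully".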
Finally, go to the “Playground” of AutoGen Builder to run the test. You will see the reply from your local Phi3.
AutoGen Builder is quite easy to use. For my next step, it would be even more interesting to create a customized agent with its own skills in AutoGen. It will be fun. Stay tuned!
Roll up your sleeves and try it. Have fun.