Running an LLM locally using Java...
There are several options, but it looks like #Ollama is getting a lot of traction!
Ollama allows you to download models from a registry and use them locally. The project is open source and available @ https://github.com/jmorganca/ollama
There's also a Java wrapper named Ollama4j (supports Java 11 and above) that lets you integrate it into your Java apps.
Code available @ https://github.com/amithkoujalgi/ollama4j
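Under the hood, the integration boils down to Ollama's local REST API, which you can also reach with nothing but the JDK's built-in HttpClient. A minimal sketch of that idea (the model name "llama2" and the default port 11434 are assumptions; Ollama4j wraps these calls in a friendlier API):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaSketch {

    // Build the JSON body for Ollama's /api/generate endpoint.
    // stream=false asks for a single complete response instead of chunks.
    static String buildRequestBody(String model, String prompt) {
        return "{\"model\":\"" + model + "\",\"prompt\":\"" + prompt + "\",\"stream\":false}";
    }

    public static void main(String[] args) throws Exception {
        String body = buildRequestBody("llama2", "Why is the sky blue?");

        // Only send the request when explicitly asked, so the sketch
        // compiles and runs even without a local Ollama instance.
        if (args.length > 0 && args[0].equals("--send")) {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:11434/api/generate"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.body());
        } else {
            System.out.println(body);
        }
    }
}
```

Run `java OllamaSketch --send` with Ollama running locally (`ollama run llama2` first) to get an actual completion; without the flag it just prints the request it would send.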
Looks like we can even deploy it to Clever Cloud!
Feedback always welcome.
-Stephan
Senior Solutions Architect with a passion for Machine Learning and Data Science. Author of JOllama, a fluent Java API for Ollama
9 months · I have released a new Java Ollama API called JOllama. All Ollama REST endpoints are supported. https://www.dhirubhai.net/posts/grahamlynch_jollama-ollama-llm-activity-7212512400149733376-Qb7E?utm_source=share&utm_medium=member_ios
Product Dev | Generalist | Maintainer of Ollama4j
1 year · Thanks for sharing this, Stephan Janssen!
Software Architect at adesso SE, langchain4j Committer
1 year · The latest version of #langchain4j supports #Ollama too
CEO at ICTCG
1 year · Is it a Java process or a native API wrapped in Java?