Perhaps give Ramalama a try?
The Hobbyist
Just a stranger trying things.
Cake day: July 16th, 2023
Indeed, Ollama is going down a shady route. https://github.com/ggml-org/llama.cpp/pull/11016#issuecomment-2599740463
I started playing with Ramalama (the name is a mouthful) and it works great. There are one or two extra steps in the setup, but I've achieved great performance, and the project makes good use of standards (OCI, Jinja, unmodified llama.cpp, from what I understand).
Go check it out; it's compatible with models from Hugging Face and Ollama too.
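For reference, a typical RamaLama session looks roughly like this. This is a sketch, not official documentation: the model name is illustrative, and the exact registry prefixes supported may vary by version, so check `ramalama --help` on your install.

```shell
# Pull a model from the Ollama registry (ollama:// prefix);
# hf:// works similarly for Hugging Face models.
ramalama pull ollama://tinyllama

# Chat with the model interactively; RamaLama runs an
# unmodified llama.cpp inside an OCI container under the hood.
ramalama run ollama://tinyllama

# Or expose the model over an HTTP endpoint instead.
ramalama serve ollama://tinyllama
```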
The Hobbyist@lemmy.zip to Selfhosted@lemmy.world • Docker Hub limiting unauthenticated users to 10 pulls per hour • English • 0 • 4 months ago
Would you be able to share more info? I remember reading about their issues with Docker, but I don't recall reading what, if anything, they switched to. What is it now?
Congrats! Amazing project, exciting interface, and you went the extra mile on the integration side with third parties. Kudos!
Edit: I’ll definitely have to try it out!