raajg | 2 years ago
Another tip: I try out a new chat interface to LLMs almost every week, and they're free to use initially. There isn't a compelling reason for me to spend $10 from the get-go on a use case I'm not sure about yet.
bradnickel | 2 years ago
Decentralized AI will eventually become p2p and swarmed, and then the true power of agents and collaboration will soar.
Anyway, excuse the soapbox, but there are zero valid reasons for supporting and paying centralized keepers of AI that rarely share, collaborate, or give back to the community that made what they have possible.
gverrilla | 2 years ago
Is this true? I tried llama last year and it was not very helpful. GPT4 is already full of problems that I have to keep circumventing, so using something less capable doesn't get me too excited.
FloorEgg | 2 years ago
vunderba | 2 years ago
https://github.com/mckaywrigley/chatbot-ui
https://github.com/oobabooga/text-generation-webui
https://github.com/mudler/LocalAI
And then connect them to offline model servers:
- Ollama
- llama.cpp
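As a sketch of what "connecting them" looks like in practice: both Ollama and llama.cpp expose a local HTTP API that frontends talk to. Below is a minimal Python example against Ollama's default endpoint (`http://localhost:11434/api/generate`); the model name `llama3` is an assumption and depends on what you have pulled locally.

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes the server is running on this port).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming generate request for Ollama's HTTP API."""
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")


def ask(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama server and return its reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# Usage (requires a running Ollama server with the model pulled):
#   ask("llama3", "Why is local inference useful?")
```

The same pattern works for llama.cpp's `llama-server`, which serves an OpenAI-compatible API on a local port, so open-source frontends can point at either backend with just a base-URL change.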
And you should avoid closed source frontends:
- Recurse
- LM Studio
And closed source models:
- ChatGPT
- Gemini
ukuina | 2 years ago
I'm assuming I cannot block internet access to the app because it needs to verify App Store entitlement.
giblfiz | 2 years ago