nborwankar | 1 year ago
Private data, especially in the enterprise, cannot be sent to public LLMs like GPT-4 or 5 or N. Use cases needing data privacy have to use an internally implemented LLM application. Currently, RAG is a concrete and pragmatic enterprise use of LLMs, aside from summarization, for data that is not amenable to using GPT-4.
GPT-5 may very well be amazing. But unless it runs on-prem it can’t be used in many scenarios because of data privacy.
To the OP - learning how to run LLMs locally via, say, Ollama (see ollama.ai) will get you started in a hands-on manner. See the /r/LocalLLaMA subreddit for a very active community around running LLMs locally.
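For a concrete feel of what "running locally" looks like: once Ollama is installed and serving, it exposes an HTTP API on localhost, so your data never leaves the machine. A minimal sketch, assuming `ollama serve` is running on the default port 11434 and a model such as `llama3` has been pulled with `ollama pull llama3` (the model name here is just an example):

```python
# Minimal sketch: query a locally running Ollama server over its HTTP API.
# Assumes the server is at the default http://localhost:11434 and that a
# model named "llama3" has already been pulled -- adjust as needed.
import json
import urllib.request


def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )


def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local model and return its full response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    # Everything stays on-prem: the prompt and response never leave localhost.
    print(ask("llama3", "Summarize retrieval-augmented generation in one sentence."))
```

The same request shape works for any model you've pulled, which is what makes this a low-friction way to experiment before wiring an LLM into a private RAG pipeline.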
p1esk | 1 year ago
Edit: I’m assuming the scenario where you do want to use the best model.