fenykep | 2 months ago
[edit] Oh and apparently you can also run some models directly from Hugging Face: https://huggingface.co/docs/hub/ollama
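Per that docs page, the idea is that Ollama can pull GGUF repos straight from the Hub by prefixing the repo path with `hf.co/`. A minimal sketch (the repo name below is just an illustrative placeholder, and you need Ollama installed and a GGUF model repo that actually exists):

```shell
# Run a GGUF model hosted on Hugging Face directly via Ollama.
# Format: ollama run hf.co/{username}/{repository}
# The repo below is a placeholder example, not a recommendation.
ollama run hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF

# Per the docs you can also pick a specific quantization by tag,
# e.g. appending :Q4_K_M to the repo path.
ollama run hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF:Q4_K_M
```

The upside is that you skip writing a Modelfile or waiting for the model to appear in the Ollama library.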
ashirviskas | 2 months ago
If you've ever used a terminal, use llama.cpp. You can also run models directly from Hugging Face with llama.cpp, afaik.
fenykep | 2 months ago