WingNews

Running Hugging Face GGUF Models Locally with Ollama [video]

3 points | jexp | 2 years ago | youtube.com

1 comment

jexp | 2 years ago
A quick 5-minute video on downloading Hugging Face language models in GGUF format (quantized by TheBloke), running them locally with Ollama, and checking GPU usage with asitop (a top-style monitor for Apple Silicon Macs).
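For reference, the workflow in the video boils down to pointing an Ollama Modelfile at the downloaded GGUF file. A minimal sketch (the filename is illustrative; substitute whichever of TheBloke's quantizations you downloaded from Hugging Face):

```
# Modelfile — register a locally downloaded GGUF file with Ollama
# (example filename; use the quantization you actually downloaded)
FROM ./mistral-7b-instruct-v0.1.Q4_K_M.gguf

# optional sampling parameter
PARAMETER temperature 0.7
```

Then `ollama create mistral-local -f Modelfile` registers the model and `ollama run mistral-local` starts an interactive session. On an Apple Silicon Mac, running `sudo asitop` in another terminal shows CPU/GPU/ANE utilization while the model generates.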
powered by hn/api // news.ycombinator.com