top | item 38540630

apstats | 2 years ago

This is really cool. I wonder how long it will be till we have GPT-4 quality models that run locally (if we ever will). Would open up a lot of possibilities.

compinter | 2 years ago

We aren't quite there yet, but the last year has been an incredibly exciting time for the open source LLM community. If your computer is decently powerful, you might be really surprised by what's already possible. LM Studio on Apple Silicon supports GGUF quants on GPU out of the box.
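For anyone curious what those GGUF files actually are: it's just a single-file container format that llama.cpp (and tools built on it, like LM Studio) load. A minimal sketch of the header layout, assuming the current GGUF spec (4-byte magic "GGUF", then a little-endian uint32 version, uint64 tensor count, and uint64 metadata key/value count) — not production parsing code:

```python
import struct

def build_header(version: int, n_tensors: int, n_kv: int) -> bytes:
    # GGUF header: magic "GGUF", then <uint32 version, uint64 tensor
    # count, uint64 metadata KV count>, all little-endian.
    return b"GGUF" + struct.pack("<IQQ", version, n_tensors, n_kv)

def parse_header(data: bytes) -> dict:
    # Check the magic, then unpack the three fixed-size fields after it.
    if data[:4] != b"GGUF":
        raise ValueError("not a GGUF file")
    version, n_tensors, n_kv = struct.unpack_from("<IQQ", data, 4)
    return {"version": version, "n_tensors": n_tensors, "n_kv": n_kv}

header = build_header(version=3, n_tensors=291, n_kv=24)
print(parse_header(header))
# → {'version': 3, 'n_tensors': 291, 'n_kv': 24}
```

After the header comes the metadata (architecture, tokenizer, quantization type, etc.) and then the tensor data itself, which is why one .gguf file is all a runtime needs.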