item 44843274

gazarsgo | 6 months ago

I dunno, I ran `ollama run gpt-oss:20b` locally and it only used about 16GB, and inference was decent enough on my MacBook.

latchkey | 6 months ago

Now do the 120b model.

Bud | 6 months ago

[deleted]