nacs | 7 days ago

> I really hope more people realize that local LLMs are where it's at

No worries, the AI companies thought ahead: by sending GPU, RAM, and now even hard drive prices through the roof, you won't have a computer to run a local model on.
