crystaln | 29 days ago

Seems much more likely that the cost will go down by 99%. With open-source models and architectural innovations, something like Claude will run on a local machine for free.

walterbell | 29 days ago

How much RAM and SSD will future local inference need to be competitive with present cloud inference?
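
As a rough back-of-envelope sketch (every parameter count, quantization level, and architecture number below is an illustrative assumption, not any particular model's spec): weight memory is roughly parameters × bits-per-weight / 8, plus a KV cache that grows with context length.

    # Back-of-envelope RAM estimate for local LLM inference.
    # All numbers below are hypothetical assumptions for illustration,
    # not the specs of any real model.

    def model_ram_gb(params_billions: float, bits_per_weight: float) -> float:
        """Weight memory in GB: parameters * bits-per-weight / 8."""
        return params_billions * 1e9 * bits_per_weight / 8 / 1e9

    def kv_cache_gb(layers: int, kv_heads: int, head_dim: int,
                    context_len: int, bytes_per_elem: int = 2) -> float:
        """KV cache: 2 (K and V) * layers * heads * head_dim * context * bytes."""
        return 2 * layers * kv_heads * head_dim * context_len * bytes_per_elem / 1e9

    # Hypothetical 70B-parameter dense model at a few quantization levels.
    for bits in (16, 8, 4):
        print(f"70B weights @ {bits}-bit: {model_ram_gb(70, bits):.0f} GB")

    # Hypothetical architecture: 80 layers, 8 KV heads of dim 128, 32k context.
    print(f"KV cache @ 32k context: {kv_cache_gb(80, 8, 128, 32_768):.1f} GB")

Under those assumptions a 70B model needs about 140 GB at 16-bit, 70 GB at 8-bit, and 35 GB at 4-bit, plus roughly 11 GB of KV cache at a 32k context, which is why quantization level dominates whether a model fits in consumer RAM.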