
The_Colonel | 1 year ago

> enabling consumers to run big-LLM inference locally

A non-technical reason is that the market for people who want to run their own LLMs at home is very small.
