item 43368199

dariusj18 | 11 months ago
I've been wanting a local LLM appliance.

    brookst | 11 months ago
    Tech is evolving too quickly; every year the hardware will be much more powerful at the same price (as LLM optimizations reach hardware), so you'd end up replacing the device frequently.

        unknown | 11 months ago
        [deleted]

        nextts | 11 months ago
        Not convinced. Are CPUs and GPUs killing it %/$-wise each year like it's 1996? Models are killing it, but that is just an "ollama run" command away.

        readthenotes1 | 11 months ago
        Like phones?