top | item 40244520

unsatchmo | 1 year ago

Mobile devices don’t have the memory or the memory bandwidth to run an LLM that’s big enough to be good at much. Plus the fixed battery and thermal constraints.
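The bandwidth point is the binding one for inference: each generated token streams essentially all the model weights through the memory bus once, so peak tokens/sec is roughly bandwidth divided by model size. A back-of-envelope sketch (the model size and quantization below are assumptions, not measurements; the bandwidth figure is Apple's published spec for the M1):

```python
# Back-of-envelope ceiling on decode speed for a local LLM.
# Decoding is memory-bandwidth bound: every token reads all weights once.
PARAMS = 7e9            # assumed 7B-parameter model
BYTES_PER_PARAM = 0.5   # assumed 4-bit quantization
BANDWIDTH = 68.25e9     # M1's published memory bandwidth, bytes/s

model_bytes = PARAMS * BYTES_PER_PARAM           # ~3.5 GB of weights
tokens_per_sec_ceiling = BANDWIDTH / model_bytes
print(f"~{tokens_per_sec_ceiling:.1f} tok/s upper bound")  # ~19.5 tok/s
```

A phone with a fraction of that bandwidth scales the ceiling down proportionally, before thermals throttle it further.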

bendews | 1 year ago

My M1 MacBook Air can run LLMs pretty well, despite having worse specs than the latest iPad Pro (and the iPhone Pro wouldn't be too far behind).

nkozyra | 1 year ago

Running them is a whole lot less resource intensive than training them.

Unless the plan was just to build a RAG source from your personal data, in which case it would be yet another underwhelming feature.
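The "RAG source from your personal data" idea is simple enough to sketch: embed the documents, embed the query, retrieve the nearest chunk, and stuff it into the prompt. A minimal toy version, using bag-of-words counts as a hypothetical stand-in for a real embedding model:

```python
# Minimal RAG retrieval sketch: find the personal-data chunk closest to
# the query and prepend it as context. Bag-of-words "embeddings" here
# are a toy stand-in for a real embedding model.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Hypothetical embedding: raw token counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str]) -> str:
    # Return the document most similar to the query.
    q = embed(query)
    return max(docs, key=lambda d: cosine(q, embed(d)))

docs = [
    "calendar: dentist appointment on friday at 3pm",
    "notes: wifi password is hunter2",
    "email: flight to denver departs saturday morning",
]
context = retrieve("when is my dentist appointment", docs)
prompt = f"Context: {context}\nQuestion: when is my dentist appointment?"
print(prompt)
```

The heavy lifting in a real version is the embedding model and the index, not the LLM, which is why this is cheap enough to run on-device even when training isn't.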

ugh123 | 1 year ago

I guess that will never change, huh?