top | item 44081433

srikz | 9 months ago

I want to see more models that can be streamed to a browser and run locally via Wasm. That would be my hope for small models: something in the <100 MB range.
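For a rough sense of what fits in that budget, here is a back-of-envelope sketch (my own numbers, not from the thread) of how many parameters a ~100 MB download can hold at a few common quantization widths:

```python
# Back-of-envelope: parameters that fit in a ~100 MB model download.
# The quantization widths below are assumed typical values.
BUDGET_BYTES = 100 * 1024 * 1024

BYTES_PER_PARAM = {
    "fp16": 2.0,   # 16-bit floats
    "int8": 1.0,   # 8-bit quantization
    "int4": 0.5,   # 4-bit quantization
}

for fmt, nbytes in BYTES_PER_PARAM.items():
    params_millions = BUDGET_BYTES / nbytes / 1e6
    print(f"{fmt}: ~{params_millions:.0f}M parameters")
```

So even at 4-bit quantization, 100 MB caps out around a ~200M-parameter model, well under the 1B models discussed below.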

firejake308 | 9 months ago

After experimenting with 1B models, I'm starting to think that any model with 1B parameters or fewer will lack much of the general intelligence we see in frontier models, because it seems physically impossible to encode that much information in so few parameters. In the very-small-model range, I believe the winners will be models fine-tuned to a narrow set of tasks or domains: a model that translates between English and any other language, a legal summarization model, etc.

relaxing | 9 months ago

Why? Just so user data stays local?

dainiusse | 9 months ago

Yes. And also the cost to run it.