[+] [-] magicalhippo|5 months ago|reply
https://news.ycombinator.com/item?id=45506268 Less is more: Recursive reasoning with tiny networks (54 comments)
[+] [-] firefax|5 months ago|reply
Can someone elaborate on the meaning of "7m model"?

I'm new to AI, and had an LLM spit out an explanation of why some of the "local" models don't work in Ollama on my Air, but... I don't know how accurate the AI is, heh.

It's my understanding most models are more like 1-30B (as in billion).
[+] [-] magicalhippo|5 months ago|reply
They have just four small layers, rather than several dozen large ones. Off the top of my head, Gemma 3 27B has 63 layers or so. Those layers are also much larger, since the big models use a much larger number of embedding dimensions. Hence the tiny networks end up with ~7 million weights, or parameters, rather than billions.
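For a rough sense of where those counts come from, here is a back-of-the-envelope sketch in Python. The shapes below are illustrative guesses on my part, not the paper's or Gemma's exact architecture:

    # Rough parameter count for a plain decoder-style transformer,
    # ignoring biases, layer norms, and attention tricks like GQA.
    def transformer_params(n_layers: int, d_model: int, vocab: int) -> int:
        # Per layer: ~4*d^2 for the attention projections (Q, K, V, output)
        # plus ~8*d^2 for a 4x-wide MLP, so ~12*d^2 in total.
        per_layer = 12 * d_model ** 2
        # Token embeddings: one d_model-sized vector per vocabulary entry.
        return n_layers * per_layer + vocab * d_model

    # Four narrow layers land in the millions:
    tiny = transformer_params(n_layers=4, d_model=384, vocab=2048)      # ~7.9M
    # Dozens of wide layers land in the billions (Gemma-3-27B-ish shape):
    big = transformer_params(n_layers=62, d_model=5376, vocab=262144)   # ~22.9B
    print(f"tiny: ~{tiny/1e6:.1f}M params, big: ~{big/1e9:.1f}B params")

The real Gemma 3 27B comes out higher (~27B) because its MLP is wider than 4x and this formula skips several details, but the scaling intuition is the same: parameter count grows with layers times width squared.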