top | item 38544874

jefft255 | 2 years ago

A couple million, IIRC. Nothing "large" compared to modern transformer models.

cs702 | 2 years ago

Thanks for getting back to me. That's what I thought. The magic seems to start happening in the low billions of parameters -- and I say "seems" because there's no consensus on whether it's really magic. In any case, it's a shame that most of the human brainpower capable of improving SotA AI doesn't have access to large-scale compute.