
elmomle | 2 months ago

That's a fair point. I suspect that to someone outside the field, touting major breakthroughs while trying to conceal that their first model was a distillation can create skepticism about the quality of their research. From what I've gathered, though, their research has added meaningfully to our understanding of optimal model scaling and faster training.
