apike | 1 year ago

Yeah, this is an interesting point. Other threads bring up the "bitter lesson" — that expert-curated approaches to ML have historically failed to scale — and suggest that human-generated LLM training data may just be repeating that dead end. Maybe so.

Something that is new this time around, AFAIK, is that we haven't previously had general ML systems that businesses and consumers are paying billions of dollars a year to use. So if, say, 10% of revenue goes back into making better data sets every year, I can imagine continued improvement on certain economically valuable use cases – though likely with diminishing returns.
