nospice | 1 month ago
At some point, we might end up in a steady state where the models are as good as they can be and the training arms race is over, but we're not there yet.
Aurornis | 1 month ago
Fixed costs can't be rolled into the unit economics because the divisor is continually growing. The marginal costs of each incremental token/query don't depend on the training cost.
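The amortization point above can be sketched numerically. This is an illustrative toy model, not real figures: the training cost, marginal cost, and query counts below are made-up assumptions.

```python
def cost_per_query(fixed_training_cost: float, marginal_cost: float, num_queries: float) -> float:
    """Average cost per query: amortized fixed cost plus marginal cost.

    The fixed (training) share shrinks as the divisor grows, while the
    marginal cost per query is unaffected by how much training cost.
    """
    return fixed_training_cost / num_queries + marginal_cost

# Hypothetical numbers: $1B training run, $0.01 marginal cost per query.
print(cost_per_query(1e9, 0.01, 1e9))   # at 1B queries, fixed cost still dominates
print(cost_per_query(1e9, 0.01, 1e12))  # at 1T queries, cost approaches the marginal $0.01
```

With a continually growing divisor, the average cost trends toward the marginal cost alone, which is why the two can't be folded into a single stable unit price.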
cortesoft | 1 month ago
The training is already done when you make a generative query. No matter how many consumers there are, the cost for training is fixed.
I presume historical internal datasets remain high value, since they might be cleaner (no slop) or no longer available elsewhere (copyright takedowns), and companies are getting better at hiding their data from spidering.