wwwtyro | 1 year ago
OpenAI has been around since 2015. Even if we give them four years to ramp up, that's still five years' worth of data. If you're referring to the example he gave of token cost, that could just be him pulling two points off his data set to serve as an example. I don't know that that's the case, of course, but I don't see anything in his text that contradicts the point.
> I don't think that it makes much sense to compare commercial pricing schemes to technical advancements.
How about Kurzweil's plot [1]?
[1] https://scx2.b-cdn.net/gfx/news/hires/2011/kurzweilfig.1.jpg
tim333 | 1 year ago
There's a better one that goes to 2023 https://www.bvp.com/assets/uploads/2024/03/Price_Computation...
The rate of progress there is more like Moore's-law 18-month doubling, though that chart tracks compute per dollar rather than Moore's transistor density.
I think 10x per year is a bit questionable - it's way out of line with the Kurzweil trend.
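To make the gap concrete, here's a quick back-of-the-envelope in Python; the only inputs are the two rates cited above (an 18-month doubling vs. a claimed 10x per year):

    # Annual growth factor implied by an 18-month doubling time,
    # compared against a claimed 10x-per-year improvement.
    moore_annual = 2 ** (12 / 18)   # ~1.59x per year
    claimed_annual = 10.0           # 10x per year
    print(f"18-month doubling -> {moore_annual:.2f}x per year")
    print(f"Claimed rate is {claimed_annual / moore_annual:.1f}x faster annually")

So a 10x-per-year claim is roughly 6x the annual growth factor the long-run trend implies.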
LegionMammal978 | 1 year ago
(I'd be especially interested in amortized price performance, i.e., the number of useful computations from a system over its lifetime, divided by the total cost to build, maintain, and operate it. That's going to be the ultimate constraint on what you can do with a given amount of funding.)
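A minimal sketch of that metric, just to show the shape of the calculation; every number below is a hypothetical placeholder, not real pricing data:

    # Amortized price performance: useful computations over the system's
    # lifetime, divided by the total cost to build, maintain, and operate it.
    # All figures are made up for illustration.
    useful_ops_per_sec = 1e15       # sustained useful FLOP/s (hypothetical)
    lifetime_years = 5              # assumed service life
    build_cost = 10_000_000        # capex, dollars (hypothetical)
    upkeep_per_year = 2_000_000    # maintenance + power, dollars/year (hypothetical)

    seconds = lifetime_years * 365 * 24 * 3600
    total_ops = useful_ops_per_sec * seconds
    total_cost = build_cost + upkeep_per_year * lifetime_years
    print(f"{total_ops / total_cost:.3g} useful ops per dollar, amortized")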