top | item 42996172

wwwtyro | 1 year ago

> Moore waited at least five years [1] before deriving his law.

OpenAI has been around since 2015. Even if we give them four years to ramp up, that's still five years' worth of data. If you're referring to the example he gave of token cost, that could just be him pulling two points off his data set to serve as an example. I don't know if that's the case, of course, but I don't see anything in his text that contradicts the point.

> I don't think that it makes much sense to compare commercial pricing schemes to technical advancements.

How about Kurzweil's plot [1]?

[1] https://scx2.b-cdn.net/gfx/news/hires/2011/kurzweilfig.1.jpg

tim333 | 1 year ago

That Kurzweil plot is a bit ancient; it only goes up to 1998 or so.

There's a better one that goes to 2023 https://www.bvp.com/assets/uploads/2024/03/Price_Computation...

The rate of progress there is more like the Moore's law 18-month doubling, though measured as compute per dollar rather than Moore's transistor density.

I think 10x per year is a bit questionable - it's way out of line with the Kurzweil trend.
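To make the comparison concrete, here's a small sketch (my own illustration, not from either plot) converting the doubling times mentioned in this thread into annual growth factors, so the 10x/year claim can be lined up against them directly:

```python
# Annual growth factors implied by different doubling times,
# using the figures mentioned in this thread:
#  - Moore's-law-style 18-month doubling
#  - the ~2.3-year doubling read off the BVP price-performance plot
#  - the claimed 10x per year
def annual_factor(doubling_years: float) -> float:
    """Growth factor per year implied by a given doubling time."""
    return 2 ** (1 / doubling_years)

moore_18mo = annual_factor(1.5)   # ~1.59x per year
bvp_trend = annual_factor(2.3)    # ~1.35x per year
claimed = 10.0                    # 10x per year

print(f"18-month doubling:  {moore_18mo:.2f}x per year")
print(f"~2.3-year doubling: {bvp_trend:.2f}x per year")
print(f"Claimed rate:       {claimed:.0f}x per year")
```

Even the faster 18-month doubling works out to only about 1.6x per year, so 10x per year is roughly six times that, which is the sense in which it's way out of line with the historical trend.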

LegionMammal978 | 1 year ago

Yeah, price performance definitely seems to be the more important metric here. Anyone can get more compute by building a bigger and more expensive chip, but per-dollar metrics can't be gamed so easily. Though even in that plot, it's only doubled every ~2.3 years since 2008.

(I'd be especially interested in amortized price performance, i.e., the number of useful computations from a system over its lifetime, divided by the total cost to build, maintain, and operate it. That's going to be the ultimate constraint on what you can do with a given amount of funding.)
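The amortized metric described above can be sketched as a simple calculation; all of the parameter names and the numbers in the example are hypothetical placeholders, not figures from any real system:

```python
# Sketch of the amortized price-performance idea: useful computations
# over a system's lifetime, divided by the total cost to build,
# maintain, and operate it. All inputs below are made-up placeholders.
def amortized_price_performance(
    ops_per_second: float,   # sustained useful throughput
    utilization: float,      # fraction of lifetime doing useful work
    lifetime_years: float,   # how long the system stays in service
    build_cost: float,       # up-front capital expense, in dollars
    yearly_opex: float,      # power, cooling, maintenance per year
) -> float:
    """Useful operations per dollar over the system's whole lifetime."""
    seconds = lifetime_years * 365 * 24 * 3600
    useful_ops = ops_per_second * utilization * seconds
    total_cost = build_cost + yearly_opex * lifetime_years
    return useful_ops / total_cost

# Made-up example: a 1 PFLOP/s system at 60% utilization over a
# 5-year lifetime, costing $10M to build and $1M/year to operate.
print(amortized_price_performance(1e15, 0.6, 5, 10e6, 1e6))
```

One design note: because operating cost scales with lifetime while build cost is fixed, longer lifetimes amortize the capital expense but eventually the per-year opex dominates, which is why this ratio, rather than raw compute per dollar of hardware, would be the binding constraint on a fixed budget.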