top | item 44973366

vitorbaptistaa | 6 months ago

That's very interesting, although I'm still curious about the training resource usage -- not "only" inference. I wonder what the relative importance of training is (i.e., what percentage of the resource usage went to training vs. inference).

cillian64|6 months ago

One random preprint I found (https://arxiv.org/html/2505.09598v2) estimates inference is 90% of the total energy usage. From some googling around, every source I found agrees that inference dominates training (or is at least comparable to it) on the large, commonly used models.

I was less surprised that inference dominates training after reading that ChatGPT serves billions of requests per day.
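A rough back-of-envelope sketch of why high request volume makes cumulative inference energy overtake a one-time training cost. Every number below is a hypothetical placeholder chosen for illustration, not a figure from the preprint or from OpenAI:

```python
# Back-of-envelope: cumulative inference energy vs. one-time training energy.
# All constants are hypothetical placeholders, not real measurements.

TRAINING_ENERGY_KWH = 10_000_000   # assumed one-time training cost (10 GWh)
ENERGY_PER_REQUEST_KWH = 0.001     # assumed ~1 Wh per inference request
REQUESTS_PER_DAY = 1_000_000_000   # assumed 1 billion requests per day

def inference_share(days: int) -> float:
    """Fraction of total energy spent on inference after `days` of serving."""
    inference = ENERGY_PER_REQUEST_KWH * REQUESTS_PER_DAY * days
    return inference / (inference + TRAINING_ENERGY_KWH)

# Under these assumptions, daily inference is 1 GWh, so cumulative inference
# matches the training cost after only 10 days and keeps growing from there.
```

With these placeholder numbers, inference accounts for half the total after 10 days and over 90% after 100 days; the real split depends entirely on the actual per-request cost and request volume, which is exactly what the parent comment is asking about.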

jeffbee|6 months ago

Many papers on large models make claims about the energy used. Google has consistently said that training and inference use about the same fraction of their data center resources.

IAmBroom|6 months ago

Same could be said for anything: human operator, google search algos pre-AI (where indexing = training), etc.

beepbooptheory|6 months ago

Is this just meant to be dismissive, or just like a kind of non-answer? That it could be said for anything doesn't make asking it of this specific thing unimportant or uninteresting.

add-sub-mul-div|6 months ago

Right, and the scraping, the extra storage required, the manufacture of all the extra GPUs, etc. This is them hoping people don't understand that the query is only one part of it and aren't curious enough to ask further questions.

xnx|6 months ago

It's rare to see this type of calculation anywhere, though I wish it weren't.

A miles per gallon number for a car doesn't count the diesel that went into the equipment to mine the ore to make the steel for the chassis, etc.