

Gbox4 | 1 year ago

A funny comment from that article:

"On CoreWeave, renting an Nvidia A100 40GB — one popular choice for model training and inferencing — costs $2.39 per hour, which works out to $1,200 per month. On Azure, the same GPU costs $3.40 per hour, or $2,482 per month; on Google Cloud, it’s $3.67 per hour, or $2,682 per month."

Am I missing something? I'm sure I'm a bit rusty at math, but I can still handle a calculator. There are roughly 720 hours in a month, which means CoreWeave would cost $1,720.80 per month, Azure $2,448, and Google Cloud $2,642.40.
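For anyone who wants to double-check, here's the same arithmetic as a throwaway Python snippet, using the hourly rates quoted in the article and assuming a 720-hour (30-day) month:

    # Hourly rates as quoted in the article; 720 hours = 30 days * 24 hours
    rates = {"CoreWeave": 2.39, "Azure": 3.40, "Google Cloud": 3.67}
    for provider, rate in rates.items():
        print(f"{provider}: ${rate:.2f}/hr -> ${rate * 720:,.2f}/month")
    # CoreWeave: $2.39/hr -> $1,720.80/month
    # Azure: $3.40/hr -> $2,448.00/month
    # Google Cloud: $3.67/hr -> $2,642.40/month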

Why are all of the numbers reported in the article off? Some only slightly (Azure and Google Cloud are close), but CoreWeave is off by about 30%. I won't speculate further on how the author arrived at these results, but I do wonder whether this article was written by AI, which would explain why the basic multiplication is wrong.

coffeebeqn | 1 year ago

Have you ever asked an LLM to calculate costs for you? This is exactly what that looks like.

fancyfredbot | 1 year ago

The whole thing is nuts. They multiplied the wrong costs by the wrong time period to get the wrong answers. https://coreweave.com/gpu-cloud-pricing says an A100 40GB NVLink is $2.06 per hour, whereas the article says $2.39.

That's $1,483.20 a month; the article says $1,200, and it should say $1,720.80 if they'd got the maths right using their own $2.39 figure.
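For concreteness, here's that comparison as a quick Python sketch, assuming the same 720-hour (30-day) month as above:

    # CoreWeave's listed rate vs. the article's rate, both over a 720-hour month
    for label, rate in [("listed", 2.06), ("article", 2.39)]:
        print(f"{label} ${rate}/hr -> ${rate * 720:,.2f}/month")
    # listed $2.06/hr -> $1,483.20/month
    # article $2.39/hr -> $1,720.80/month
    # ...neither of which is the article's $1,200.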

programjames | 1 year ago

Maybe an LLM helped with the math?

jmgao | 1 year ago

The CoreWeave number is completely wrong, but Azure comes out exactly right at 730 hours per month, which is the number of hours in 365/12 days (an average month), and Google Cloud is within a few dollars of it.
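A quick check in Python, taking 730 as 24 * 365 / 12 hours in an average month:

    hours = 24 * 365 / 12  # 730.0, the average number of hours in a month
    print(f"Azure: ${3.40 * hours:,.2f}")         # $2,482.00, matches the article
    print(f"Google Cloud: ${3.67 * hours:,.2f}")  # $2,679.10, vs. the article's $2,682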

umeshunni | 1 year ago

My favorite quote about journalists goes something like "Never trust a journalist's math. If they could do math, they wouldn't have become journalists."

spacebanana7 | 1 year ago

Could it be a discount for purchasing an entire month’s worth of capacity? Even if so, such pricing plans should be made explicit in the article.

Onawa | 1 year ago

I think your guess of AI generation makes sense for the math discrepancy.

Thrymr | 1 year ago

Wonderful that we have evolved large linear algebra models running on expensive computers to the point that they can no longer do basic arithmetic correctly.