(no title)
Gbox4 | 1 year ago
"On CoreWeave, renting an Nvidia A100 40GB — one popular choice for model training and inferencing — costs $2.39 per hour, which works out to $1,200 per month. On Azure, the same GPU costs $3.40 per hour, or $2,482 per month; on Google Cloud, it’s $3.67 per hour, or $2,682 per month."
Am I missing something? I'm sure I'm a bit rusty at math, but I can still handle a calculator. There are roughly 720 hours in a month, which means CoreWeave would cost $1,720.80 per month, Azure $2,448 per month, and Google Cloud $2,642.40 per month.
Why are all of the numbers reported in the article off? Some only slightly: Azure and Google Cloud are close, but CoreWeave is off by about 30%. I won't speculate about how the author arrived at these results, but I do wonder whether this article was written by AI, which would explain why the basic multiplication is wrong.
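For reference, here is the arithmetic spelled out as a small Python sketch, assuming a 30-day, 720-hour month and using the hourly rates quoted in the article:

    # Sanity check: monthly cost = hourly rate x ~720 hours (24 h x 30 days)
    HOURS_PER_MONTH = 24 * 30  # 720

    hourly_rates = {  # A100 40GB prices quoted in the article, $/hr
        "CoreWeave": 2.39,
        "Azure": 3.40,
        "Google Cloud": 3.67,
    }

    for provider, rate in hourly_rates.items():
        print(f"{provider}: ${rate * HOURS_PER_MONTH:,.2f}/month")

    # CoreWeave: $1,720.80/month
    # Azure: $2,448.00/month
    # Google Cloud: $2,642.40/month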
fancyfredbot | 1 year ago
That's $1483.20 a month, whereas the article says $1200 and should say $1720 if they'd got the maths right.