top | item 35606573

DigitalDopamine | 2 years ago

Renting 40 Nvidia A100s is around $70k per month (on Vultr, as far as I can see). So this would cost only $420k for 6 months. Seems doable.

Is 40 A100s enough, though? I am interested in what this would cost.
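A quick sketch of the arithmetic above (the $70k/month and 40-GPU figures are taken from the comment; the implied per-GPU-hour rate is derived, not a quoted Vultr price):

```python
# Back-of-envelope for renting 40 A100s, using the figures above.
GPUS = 40
MONTHLY_RENT_USD = 70_000   # quoted monthly price for 40 A100s
MONTHS = 6
HOURS_PER_MONTH = 730       # ~365 * 24 / 12

total = MONTHLY_RENT_USD * MONTHS
per_gpu_hour = MONTHLY_RENT_USD / (GPUS * HOURS_PER_MONTH)

print(f"6-month total: ${total:,}")                 # → $420,000
print(f"Implied rate:  ${per_gpu_hour:.2f}/GPU-hr") # → $2.40/GPU-hr
```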

nl | 2 years ago

LLaMA 65B used 2048 80GB A100s for 21 days:

> When training a 65B-parameter model, our code processes around 380 tokens/sec/GPU on 2048 A100 GPU with 80GB of RAM.[1]

Note that you should probably budget for two to three times that, because things go wrong and it usually takes multiple starts to get a good training run.

Smaller models are cheaper though.

[1] https://arxiv.org/pdf/2302.13971.pdf
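The quoted throughput lets you sanity-check the paper's numbers and scale them down to the 40-GPU cluster proposed above (assuming, optimistically, that per-GPU throughput is unchanged at smaller scale):

```python
# Sanity check on the quoted LLaMA 65B training figures.
TOK_PER_SEC_PER_GPU = 380   # from the paper, quoted above
PAPER_GPUS = 2048
PAPER_DAYS = 21
SECONDS_PER_DAY = 86_400

total_tokens = TOK_PER_SEC_PER_GPU * PAPER_GPUS * PAPER_DAYS * SECONDS_PER_DAY
print(f"Tokens processed: ~{total_tokens / 1e12:.1f}T")  # → ~1.4T, matching the paper

# Same token budget on 40 GPUs, assuming linear scaling:
days_on_40 = PAPER_GPUS / 40 * PAPER_DAYS
print(f"Equivalent run on 40 GPUs: ~{days_on_40:.0f} days")  # → ~1075 days
```

So 40 A100s is nowhere near enough for a 65B run on that schedule, though it puts smaller models within reach.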

DesiLurker | 2 years ago

It pains me to see AMD just sitting on their asses through this incredible development of AI and possibly AGI. If they still can't get their act together, they should spin off the discrete GPU division into something purely compute-focused. I believe there is now enough momentum in the AI/ML space to fully develop innovative ideas on the hardware front.

mlboss | 2 years ago

It would be great if this could be done on 3090s. A used 3090 usually costs $500-1000.
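A rough VRAM sketch for a 65B model on 24GB 3090s. The bytes-per-parameter figures are assumptions, not from this thread: 2 bytes/param for fp16 weights alone, and ~16 bytes/param for mixed-precision Adam training (fp16 weights and grads plus fp32 optimizer states, per the ZeRO paper):

```python
import math

PARAMS = 65e9           # 65B-parameter model
VRAM_3090_GB = 24       # per used 3090
USED_PRICE_USD = 750    # assumed midpoint of the $500-1000 range above

# bytes/param: 2 for fp16 weights only; ~16 for mixed-precision Adam
for label, bytes_per_param in [("fp16 weights only", 2), ("Adam training", 16)]:
    gb = PARAMS * bytes_per_param / 1e9
    cards = math.ceil(gb / VRAM_3090_GB)
    print(f"{label}: {gb:.0f} GB -> {cards} cards, ~${cards * USED_PRICE_USD:,}")
```

By this estimate just fitting the optimizer state for a 65B run would need around 44 cards (~$33k used), before accounting for activations or interconnect, which is where consumer cards really fall behind A100s.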