item 37841392


amasad | 2 years ago

We're on a budget :) Trained on 128 H100-80GB GPUs for a week (200B tokens over 5 epochs, i.e. 1T tokens total).

Tech talk here with timestamp: https://www.youtube.com/live/veShHxQYPzo?si=UlcU9j2kC-C4oWvj...


nojvek | 2 years ago

Each H100 is ~$30,000, so $3.8M in capex cost.

Roughly $1/hr/GPU in power cost, so for a week you're looking at 128 × 24 × 7 = $21,504.

Cheap compared to OpenAI, but not something an indiehacker can do by themselves unless they have millions to burn.
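The back-of-the-envelope above can be sketched in a few lines of Python, using the assumed figures from the comment ($30,000 per H100, $1/GPU-hour in power, one week of training):

```python
# Rough training-cost estimate, assuming the figures above
# (assumptions, not confirmed numbers from the Replit team):
gpus = 128
price_per_gpu = 30_000              # USD per H100, assumed
capex = gpus * price_per_gpu        # one-time hardware cost

hours = 24 * 7                      # one week of wall-clock training
power_rate = 1.0                    # USD per GPU-hour, assumed
power_cost = gpus * hours * power_rate

print(f"capex: ${capex:,}")         # $3,840,000 (~$3.8M)
print(f"power: ${power_cost:,.0f}") # $21,504
```

The capex dwarfs the power bill by two orders of magnitude, which is why renting the GPUs for a week is usually the realistic option for anyone without millions in hardware budget.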