top | item 25608581

odnes | 5 years ago

How are they sourcing/funding the compute to train these massive models?

Alvion_Bleeds | 5 years ago

Connor Leahy, who I think is a sort of BDFL figure for EleutherAI, mentioned in a Slate Star Codex online meetup I attended that Google donated millions of dollars' worth of preemptible TPU credits to the project. There is a video of the meetup on YouTube somewhere. He struck me as a really smart kid with a lot of passion.

chillee | 5 years ago

Haha, Connor (although one of the main participants) definitely isn't a BDFL - we don't have any BDFLs :)

We don't really have much of a hierarchy at all - it's mostly just a collection of researchers from widely varying backgrounds, all interested in ML research.

stellaathena | 5 years ago

I'm not sure what a BDFL figure is, but Google does not give us millions of dollars. We are part of TFRC (the TensorFlow Research Cloud), a program through which researchers and non-profits can borrow TPUs when they're not otherwise being used. You could say that we are indirectly funded as a result, but it's nowhere near millions of dollars, and it doesn't reflect any kind of special relationship with Google.

leogao | 5 years ago

EleutherAI has a very flat hierarchy; we do not have any BDFL-like figure.

luto | 5 years ago

They'll probably run it on the scientific clusters of various universities, or on collections of idle lab desktop machines. Both tend to sit idle a lot of the time, based on my experience at university in Europe.