
apfsx | 3 months ago

Google’s TPUs are not powering Gemini or whatever equivalent LLM you want to compare it to.


skirmish | 3 months ago

I can assure you that most internal ML teams are using TPUs for both training and inference; they are just so much easier to get. Whatever GPUs exist are either reserved for Google Cloud customers, or loaned temporarily to researchers who want to publish easily reproducible results externally.

stingraycharles | 3 months ago

They are. Even Apple famously uses Google Cloud for its cloud-based AI workloads, largely because Apple doesn’t want to buy from Nvidia.

Google Cloud does have a lot of Nvidia hardware, but that’s for their regular cloud customers, not internal stuff.