ajb117 | 2 years ago

I'm pretty sure you can't buy TPUs, but people usually buy GPUs instead. If you're building a personal rig these days, you can get a used Nvidia RTX 3090 for about $720 USD on eBay, which is pretty cheap for 24GB of VRAM. There's also the A6000 with 48GB of VRAM, but that'll cost about $5,000 on Amazon. Of course, there are newer cards that are faster and have more VRAM, like the 4090 and RTX 6000, but they're also more expensive.

Of course, this is all still pretty expensive. If your models are small enough, you can get away with even older GPUs with less VRAM, like a GTX 1080 Ti. And then there are services like Google Colab and vast.ai where you can rent a TPU or GPU in the cloud.
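A quick back-of-envelope way to judge whether a model fits a given card: multiply parameter count by bytes per parameter, plus some headroom for activations. A minimal sketch (the `overhead` fudge factor and the function name are my own assumptions, not from any library):

```python
def estimate_vram_gb(n_params, bytes_per_param=2, overhead=1.2):
    """Rough VRAM (in GB) to hold model weights for inference.

    bytes_per_param=2 assumes fp16/bf16 weights; overhead=1.2 is a
    hand-wavy allowance for activations/KV cache. Training needs far
    more (gradients and optimizer state on top of this).
    """
    return n_params * bytes_per_param * overhead / 1e9

# A 7B-parameter model in fp16:
print(round(estimate_vram_gb(7e9), 1))  # ~16.8 GB -- fits a 24GB RTX 3090
```

By this estimate, a 7B model in fp16 fits comfortably on a 24GB RTX 3090, while larger models push you toward a 48GB A6000 or quantization.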

I'd check out Tim Dettmers' guide for buying GPUs: https://timdettmers.com/2023/01/30/which-gpu-for-deep-learni...


jmrm | 2 years ago

AFAIK Google Coral is an inexpensive TPU you can buy right now: https://coral.ai/products/accelerator/

kmeisthax | 2 years ago

The problem is that this is an inference accelerator, i.e. it can only execute pretrained models. You cannot train a model on one of these; for that you need a training accelerator, and pretty much all of those are either Nvidia GPUs or cloud-only offerings.
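The memory gap between inference and training makes the distinction concrete. A rough per-parameter budget for mixed-precision training with Adam (a common rule of thumb, not a spec for any particular accelerator):

```python
# Per-parameter memory budget, mixed-precision training with Adam.
WEIGHTS_FP16 = 2   # the weights themselves (also all inference needs)
GRADS_FP16   = 2   # gradients
MASTER_FP32  = 4   # fp32 master copy of the weights
ADAM_M_FP32  = 4   # Adam first-moment state
ADAM_V_FP32  = 4   # Adam second-moment state

TRAIN_BYTES = WEIGHTS_FP16 + GRADS_FP16 + MASTER_FP32 + ADAM_M_FP32 + ADAM_V_FP32
INFER_BYTES = WEIGHTS_FP16

print(TRAIN_BYTES, INFER_BYTES, TRAIN_BYTES // INFER_BYTES)  # 16 2 8
```

So training holds roughly 8x more state per parameter than fp16 inference, before counting activations, which is why inference-only chips can get away with far less memory.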

worldsayshi | 2 years ago

Very cool! Although it seems to have almost no memory to speak of, so I guess many use cases like LLMs are off the table because of that?
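The arithmetic backs this up. Edge accelerators like the Coral have on-chip memory on the order of megabytes (treated here as an assumption, not a quoted spec), while even a small LLM's weights run to gigabytes. A sketch with illustrative model sizes (the MobileNet parameter count is approximate):

```python
def weights_bytes(n_params, bytes_per_param=1):
    """Size of model weights alone, assuming int8 quantization."""
    return n_params * bytes_per_param

llm_1b    = weights_bytes(1_000_000_000)  # a small 1B-param LLM: ~1000 MB
mobilenet = weights_bytes(4_000_000)      # a mobile vision model: ~4 MB

print(llm_1b // 10**6, "MB vs", mobilenet // 10**6, "MB")  # 1000 MB vs 4 MB
```

A few-megabyte vision or audio model streams through such a chip fine; a gigabyte of LLM weights would have to be constantly shuffled in from host memory, which defeats the purpose of the accelerator.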