TPUs are like the NPUs of the training world. You pour in extra time, money, and dedicated silicon and end up with an ASIC that barely competes on equal terms with a similarly priced GPU. Unless you've got access to Nvidia's TSMC supply, you're probably not going to make a dent in their demand.
Also, TPU v1, v2, and v3 were pure ASICs, but since v4 they have added new features, so their performance per watt is lower and their power draw is now quite close to Nvidia's. I think Hopper is at 700 W and TPUs are around 600 W.
talldayo|1 year ago
Additionally - TPUs are completely useless if AI goes out of style, unlike CUDA GPUs. The great thing about Nvidia's hardware right now is that you can truly use the GPU for whatever you want. Maybe AI falls through in 2026, and those GPUs can then be used for protein folding or crypto mining. Maybe crypto mining and protein folding fall through - you can still use most of those GPUs for raster rendering and gaming too! TPUs are just TPUs - if AI demand goes away, your dedicated tensor hardware is dead weight.
ilove196884|1 year ago