krasin|2 years ago
On commodity markets. GPUs are essentially a monopoly, so gluts are only really possible with outdated hardware.
My favorite eBay listing of today is a lot of 100 NVIDIA P106-100 6GB GPUs for ~$20 each: https://www.ebay.com/itm/305023042595

lhl|2 years ago
Just in case anyone gets tempted: the P106-100s were mining cards locked to PCIe 1.1, so the bus bandwidth is terrible. Add to that the limited memory (6GB), the low memory bandwidth, Pascal's 1/64-rate FP16, and the likelihood that these cards were run in extremely shoddy data centers, and IMO they're not even worth the power costs to run.
For those looking for the cheapest higher-memory options, 24GB P40s are available (a decent amount of VRAM and 3X the memory bandwidth, but they require server or DIY cooling and have the same bad FP16). IMO the best bang/buck for hobbyists/home devs at the moment is used RTX 3090s, going for about $600-700 each.
(Note: if you're doing training, unless you have very high utilization or have already calculated your costs, you will probably be much better off renting cloud GPUs from Vast.ai, RunPod, etc.)

otabdeveloper4|2 years ago
Using a full-blown GPU just for neural network inference is crazy inefficient. They should hire some blockchain dudes to build them custom hardware for one tenth of the price.

zaphirplane|2 years ago
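The memory-bandwidth gap lhl points to is the main driver of single-stream LLM inference speed: at batch size 1, every generated token streams essentially all of the model weights through the memory bus, so decode speed is bounded by bandwidth divided by model size. A back-of-envelope sketch (bandwidth figures are approximate spec-sheet numbers, and the 3.5 GB model size is an illustrative assumption for a ~7B-parameter model quantized to ~4 bits):

```python
# Batch-1 LLM decoding is usually memory-bandwidth bound, so
# tokens/s <= bandwidth / bytes_of_weights_read_per_token.
# Bandwidth values below are approximate spec-sheet figures.
CARDS_GB_S = {
    "P106-100": 192,   # 6 GB GDDR5
    "P40": 346,        # 24 GB GDDR5
    "RTX 3090": 936,   # 24 GB GDDR6X
}

def max_tokens_per_s(bandwidth_gb_s: float, model_gb: float) -> float:
    """Upper bound on decode tokens/s when every token reads all weights."""
    return bandwidth_gb_s / model_gb

MODEL_GB = 3.5  # illustrative: ~7B parameters at ~4-bit quantization
for card, bw in CARDS_GB_S.items():
    print(f"{card}: <= {max_tokens_per_s(bw, MODEL_GB):.0f} tokens/s")
```

This is only an upper bound; real throughput is lower due to KV-cache reads, kernel overhead, and (on the P106-100) the PCIe 1.1 link whenever anything spills off the card.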