grobbyy | 1 year ago
Next major step up is 48GB and then hundreds of GB. But a lot of ML models target 16-24gb since that's in the grad student price range.
bubaumba | 1 year ago
From https://www.asacomputers.com/nvidia-l40s-48gb-graphics-card....
NVIDIA L40S 48GB graphics card. Our price: $7,569.10*
Not arguing against 'great', but the cost efficiency is questionable: for about 10% of that price you can get two used 3090s. The good thing about LLMs is that they run layers sequentially, so they should be easy to parallelize: the model can be split into several sub-models, one per GPU. Then 2, 3, 4... GPUs should improve throughput roughly proportionally on big batches, and make it possible to run a bigger model on low-end hardware.
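A minimal sketch of the splitting idea the comment describes, using NumPy instead of a real GPU setup: the "model" is just a list of weight matrices applied in sequence, and `split_model` carves it into contiguous stages, one per device. All names and sizes here are illustrative, not from any actual library.

```python
import numpy as np

# Hypothetical 4-layer "model": each layer is just a matrix multiply.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((16, 16)) * 0.1 for _ in range(4)]

def split_model(layers, n_devices):
    """Split the layer list into n_devices contiguous stages (one per GPU)."""
    k, r = divmod(len(layers), n_devices)
    stages, i = [], 0
    for g in range(n_devices):
        size = k + (1 if g < r else 0)
        stages.append(layers[i:i + size])
        i += size
    return stages

def run_stage(stage, x):
    # On real hardware each stage would run on its own GPU; with big
    # batches, stage g can work on micro-batch b while stage g+1 works
    # on micro-batch b-1 (simulated sequentially here).
    for w in stage:
        x = x @ w
    return x

stages = split_model(layers, 2)
x = rng.standard_normal((8, 16))
y = run_stage(stages[1], run_stage(stages[0], x))

# The split model computes the same result as the unsplit one.
assert np.allclose(y, run_stage(layers, x))
```

Because each layer's output feeds only the next layer, splitting is exact: the speedup on big batches comes from the stages overlapping work on different micro-batches, not from changing the math.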