jarenmf | 2 years ago

Which graphics card would you recommend for running Llama 2 locally? I'm about to buy a laptop and am considering a model with a good Nvidia GPU.

jbellis | 2 years ago

If you insist on running models locally on a laptop, then a MacBook with as much unified RAM as you can afford is the only way to get a decent amount of VRAM.

But you'll save a ton of money (and time, since more capable hardware is faster) if you treat the laptop as a terminal and either buy a desktop or use cloud hardware to run the models.
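
Since the question is ultimately "can this machine run Llama 2", here's a minimal sketch of doing that with llama-cpp-python, which uses Metal on Apple Silicon. The model filename is hypothetical; substitute whatever quantized GGUF file you actually downloaded.

```python
# Minimal sketch, assuming llama-cpp-python is installed
# (pip install llama-cpp-python) and a quantized Llama 2 GGUF
# file is on disk. The filename below is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-13b-chat.Q4_K_M.gguf",  # hypothetical path
    n_gpu_layers=-1,  # offload all layers to the GPU (Metal on a MacBook)
    n_ctx=2048,       # context window size
)

out = llm("Q: Why does VRAM matter for LLM inference? A:", max_tokens=128)
print(out["choices"][0]["text"])
```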

sundarurfriend | 2 years ago

Cloud hardware like what? (Is Google Colab the best option, or at least one of the best? Is Paperspace Gradient any good? Others?)

brucethemoose2 | 2 years ago

A 16GB 3080 is probably the cheapest option that's still ideal in a big laptop.

But you can get some acceleration with anything ~6GB and up.
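
To make "some acceleration" concrete: llama.cpp-style runners can offload just part of the model to the GPU and keep the rest on the CPU. A sketch, where n_gpu_layers=20 is an illustrative guess for a ~6GB card, not a measured value:

```python
# Sketch of partial GPU offload, assuming llama-cpp-python and a
# quantized Llama 2 7B model. n_gpu_layers=20 is a guess for a
# ~6GB card; raise it until VRAM runs out.
from llama_cpp import Llama

llm = Llama(
    model_path="./llama-2-7b.Q4_K_M.gguf",  # hypothetical path
    n_gpu_layers=20,  # only these layers go to the GPU; the rest stay on CPU
)
print(llm("Hello", max_tokens=16)["choices"][0]["text"])
```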

thangngoc89 | 2 years ago

Laptop RTX cards have half the VRAM compared to their PC counterparts, so a laptop 3080 has 8GB.

thangngoc89 | 2 years ago

It’s about VRAM. I would say the more the better; a 4060 with 8GB should be the starting point.

nickthegreek | 2 years ago

A 3060 with 12GB is cheaper and provides more VRAM.

tamimio | 2 years ago

I had an Alienware with a 16GB 3080. While it was nice, the laptop was so buggy, with all sorts of hardware and software problems, that I sold it in the end. I'm still happy with my MSI Titan; it's bigger and heavier, but an overall better experience.

speedgoose | 2 years ago

The GPU with the most VRAM you can justify spending money on.
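
As a rough back-of-envelope for how much VRAM a given model needs: the weights alone take about (parameters × bits per weight ÷ 8) bytes, plus overhead for the KV cache and runtime. A sketch, where the 1.2× overhead factor is an assumption rather than a measurement:

```python
# Back-of-envelope VRAM estimate for a model's weights.
# Real usage is higher (KV cache, activations, runtime overhead);
# the 1.2x factor is an assumed fudge, not a measured number.
def approx_vram_gb(params_billion: float, bits_per_weight: int,
                   overhead: float = 1.2) -> float:
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for params in (7, 13, 70):
    for bits in (16, 4):
        print(f"Llama 2 {params}B @ {bits}-bit: ~{approx_vram_gb(params, bits):.0f} GB")
```

By this estimate a 4-bit 7B fits in roughly 4GB and a 4-bit 13B needs roughly 8GB, which is why the 8GB-vs-12GB-vs-16GB distinctions in this thread matter.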

brucethemoose2 | 2 years ago

Also, what size and ballpark price are you looking for?