top | item 40007600

Loveaway | 1 year ago

Hope they can deliver. Right now Apple hardware is silly compared to PC+Nvidia if you want to play around with GenAI, both in price and performance. Worst case, Macs end up as thin clients, with all the AI running on Nvidia in the cloud. That would eat into their competitive advantage a lot, I think.

wmf | 1 year ago

> Right now Apple hardware is silly compared to PC+Nvidia if you wanna play around with GenAI.

You mean it's silly how far ahead Apple is since they offer 192 GB of VRAM while Nvidia only allows 24 GB for reasonable prices? Or do you mean it's silly to compare <$10K Macs with >$30K Nvidia setups in the first place?

modeless | 1 year ago

What would be silly is running a 192 GB ML model on a chip that slow. In practically every case you would be better off with a multi-GPU PC or a cloud GPU instance, simply because the performance gap is enormous. You can buy a whole lot of cloud GPU hours for the price of 192 GB in a Mac, especially when you consider that the cloud price already includes electricity, and you don't need the very latest chips to far outperform Apple's best.

oidar | 1 year ago

If I wanted to build a PC today that could run the big models that were released recently (for example, Mixtral 8x22B and Command-R, with as little quantization as possible), what would I buy?
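A rough back-of-envelope sketch of the memory such models need, which is the main constraint on the hardware choice. The parameter counts (~141B for Mixtral 8x22B, ~35B for Command-R) and the flat 20% overhead for KV cache and activations are my own assumptions; real usage varies with context length and inference backend.

```python
def vram_gb(params_b: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Approximate GB needed to hold the weights plus a flat overhead factor."""
    bytes_total = params_b * 1e9 * bits_per_weight / 8
    return bytes_total * overhead / 1e9

# Approximate parameter counts, in billions (assumed, not from the thread).
models = {"Mixtral 8x22B (~141B)": 141, "Command-R (~35B)": 35}

for name, params in models.items():
    for bits in (16, 8, 4):
        print(f"{name} @ {bits}-bit: ~{vram_gb(params, bits):.0f} GB")
```

Under these assumptions, Mixtral 8x22B at 16-bit needs on the order of 340 GB, so even lightly quantized it exceeds any single consumer GPU and lands in multi-GPU or large-unified-memory territory, while Command-R at 4-bit fits in roughly 21 GB.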