top | item 40801068


jrmylee | 1 year ago

This is true. However, my friends and I only have MacBooks, so we wanted something that ran in the cloud.


spmurrayzzz | 1 year ago

Are you saying that torch performance with the MPS backend enabled didn't meet your expectations? Or are you using an Intel MacBook, and/or one with a tiny amount of RAM/VRAM?
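(For anyone following along: a minimal sketch of how you'd pick up Apple-Silicon GPU support in PyTorch via the MPS backend, which landed in torch 1.12. The `pick_device` helper name is just for illustration.)

```python
# Minimal sketch: choose the best available torch device on a Mac.
# Assumes PyTorch >= 1.12, which introduced the "mps" backend
# (Metal Performance Shaders, i.e. the Apple-Silicon GPU).
import torch

def pick_device() -> torch.device:
    # Apple-Silicon GPU, exposed through Metal
    if torch.backends.mps.is_available():
        return torch.device("mps")
    # Nvidia GPU, if present
    if torch.cuda.is_available():
        return torch.device("cuda")
    # Fall back to the CPU
    return torch.device("cpu")

device = pick_device()
x = torch.randn(4, 4, device=device)
print(x.device.type)
```

On an M-series Mac this prints `mps`; on an Intel MacBook (no Metal GPU support in torch) it falls back to `cpu`, which may be what's behind the performance complaint.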

jokethrowaway | 1 year ago

The M1 was released in 2020, so chances are that by now a MacBook has an ARM CPU and neural cores, which are OK for local inference.

AuryGlenz | 1 year ago

They’re OK, but certainly a lot slower than with an Nvidia GPU.

nox101 | 1 year ago

Modern MacBooks are great at this stuff since they have unified memory and the GPU can use all of it.