vanilla-latte|3 years ago
One advantage I have with the M1 Pro (32 GB RAM) over my gaming desktop is that I can run large ML models such as BLOOM, Whisper, and Stable Diffusion with reasonable performance.

abraxas|3 years ago
How are you running them so well? What do you use for your device, since CUDA is obviously not supported and 'mps' is not very impressive compared to just about any NVIDIA GPU, including the aging 1080 Ti [1]?

[1] https://sebastianraschka.com/blog/2022/pytorch-m1-gpu.html

ganoushoreilly|3 years ago
Out of curiosity, what are your desktop specs? I have Stable Diffusion running quite well on a few different systems of varying spec. That said, it's great to be able to take it on the road with you.

vanilla-latte|3 years ago
The limiting factor was the VRAM. The M1 can dedicate more memory to the GPU than a typical gaming card because system RAM is shared with the GPU (unified memory). I see this as a big advantage.
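For context, the 'mps' backend mentioned above is PyTorch's Metal Performance Shaders device for Apple Silicon. A minimal sketch of how one might select it with a CUDA/CPU fallback (the tiny model and tensor sizes here are illustrative, not from the thread):

```python
import torch

# Prefer CUDA on NVIDIA hardware, then MPS on Apple Silicon, else fall back to CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

# Any model moved to the MPS device runs on the GPU and draws from unified memory,
# which is why a 32 GB M1 Pro can hold models that exceed typical discrete VRAM.
model = torch.nn.Linear(4, 2).to(device)
x = torch.randn(8, 4, device=device)
y = model(x)
print(y.shape)  # torch.Size([8, 2])
```

On Apple Silicon this is the same shared pool the last comment describes: there is no separate VRAM copy, so the GPU can address far more memory than a comparable discrete card.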