I decided a MacBook Pro M4 Pro was the right option for me: 48 GB of RAM, about 36 GB of it accessible to the GPU, with very decent tokens/s throughput for (increasingly impressive) offline midrange open LLM inference, plus huge battery life. Nothing in the Windows world comes close. And the few Windows-only bits of software I occasionally use now run very happily in a Windows 11 Arm 25H2 VM on the free VMware Fusion on my Mac, with a $5 Windows OEM license.