It’s expensive, and it’s good that other options are available, but it’s not insanely expensive. The people who need the upgrade often make six-figure salaries and can afford it, even if they don’t like the idea of paying for it. I personally hate it, but I know that when I buy the machine it will last me at least 5 years, which is good enough for me.
Kudos|1 year ago
yieldcrv|1 year ago
This is unfortunately too late to justify recouping costs and buying a new M4 Max.
Primarily because I would be using it for large language model inference at higher parameter counts. What's happened in just one year is that LLMs have gotten much smaller: Llama 3.2 3B takes only 3 GB of RAM at 8-bit quantization. On top of that, cost per token in the cloud has plummeted by 99.9% too. It's just not economical.
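The 3 GB figure checks out as a weights-only estimate: bytes ≈ parameters × bits per weight / 8. A minimal sketch of that arithmetic (assuming decimal gigabytes, and ignoring KV cache and runtime overhead, which add more on top):

```python
# Back-of-the-envelope RAM estimate for LLM weights at a given
# quantization level. Weights only; KV cache and activations are extra.
def model_ram_gb(params_billions: float, bits_per_weight: int) -> float:
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB

# Llama 3.2 3B at 8-bit quantization: roughly 3 GB of weights.
print(model_ram_gb(3, 8))   # → 3.0
print(model_ram_gb(3, 4))   # → 1.5
print(model_ram_gb(70, 8))  # → 70.0
```

This is also why the 64 GB machine below still has plenty of headroom for small models.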
But yes, I could afford it; there's just no longer a justification. My 64 GB RAM M1 Max is going to be future-proof for a while.