top | item 40292662


bingbingbing777 | 1 year ago

What AI is Apple scaling?



bionhoward | 1 year ago

I've seen MLX folks post on X about nice results running local LLMs: https://github.com/ml-explore/mlx

Also, Siri. And consider: you're scaling AI on Apple's hardware, too. You can develop your own local custom AI on it; there’s more memory available for linear algebra in a maxed out MBP than the biggest GPUs you can buy.

They scale the VRAM capacity with unified memory, and that plus a ton of software is enough to make Apple's stuff plenty competitive with the corresponding NVIDIA stuff for the specific task of running big AI models locally.
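A rough sketch of why the unified-memory capacity matters: the memory just to hold a model's weights scales with parameter count times bits per parameter. The model sizes below are illustrative examples, not tied to any specific model, and the figures ignore activations, KV cache, and runtime overhead, so real usage is higher.

```python
def weight_gb(params_billions: float, bits_per_param: int) -> float:
    """GB of memory needed for the weights alone."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

# A 7B model quantized to 4 bits: ~3.5 GB -- borderline on an 8 GB
# machine once the OS and runtime take their share.
print(f"7B  @ 4-bit: {weight_gb(7, 4):.1f} GB")

# The same model at fp16: ~14 GB -- already past 8 GB.
print(f"7B  @ fp16:  {weight_gb(7, 16):.1f} GB")

# A 70B model at fp16: ~140 GB -- more than any single consumer GPU's
# VRAM, but within reach of a maxed-out unified-memory Mac.
print(f"70B @ fp16:  {weight_gb(70, 16):.1f} GB")
```

This is only the weight footprint, but it shows the shape of the trade-off the comment above describes: capacity, not raw throughput, is what decides whether a big model runs locally at all.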

Wytwwww | 1 year ago

> there’s more memory available for linear algebra in a maxed out MBP than the biggest GPUs you can buy.

But this hardly applies to 95%, if not more, of the people running Apple's hardware. The fastest CPU/GPU isn't worth much if you can't fit any even marginally useful LLM on the 8 GB (or less on iPhones/iPads) of memory your device has.