This was counter-intuitive to me too! I was recently playing around with some of the LLMs that can run on consumer hardware (via KoboldAI, RWKV, etc.) and, boy, are they not as good as ChatGPT despite consuming all my Mac's resources. Meanwhile, I can get Stable Diffusion images in under a minute!