Curious whether anyone here is having success running smaller LLMs locally on constrained hardware, such as laptops or GPU-less devices. If so, what have you found them useful for?
I’ve had solid luck with TinyLlama and Phi-2 on my MacBook Air (no GPU). It's great for quick drafts, note summaries, and basic Q&A. No internet needed, so it’s super handy when traveling.
— Uzmanali | 10 months ago
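For anyone wanting to try the setup described above, one low-friction route is Ollama, which ships quantized builds of small models like TinyLlama that run on CPU-only machines. A minimal sketch (assuming Ollama is already installed; model names and sizes may change over time):

```shell
# Download a small quantized model (TinyLlama is well under 1 GB)
ollama pull tinyllama

# Ask a one-off question directly from the command line
ollama run tinyllama "Summarize the key points of my meeting notes."
```

Running `ollama run tinyllama` with no prompt drops you into an interactive chat session instead, which works fully offline once the model has been pulled.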