
Ask HN: What tiny LLMs are you getting the best results from?

4 points | chrisrodrigue | 10 months ago

Curious whether anyone here is having success running smaller LLMs locally on constrained hardware, such as laptops or GPU-less devices. If so, what kind of utility have they brought you?

1 comment


Uzmanali | 10 months ago

I’ve had solid luck with TinyLlama and Phi-2 on my MacBook Air (no GPU). They’re great for quick drafts, note summaries, and basic Q&A. No internet needed, so they’re super handy when traveling.