top | item 46630207

cmpxchg8b | 1 month ago

8GB? What is this, an LLM for ants?

kirurik | 1 month ago

You can run some models pretty decently using CPU inference only: things like Gemma 3, which is built for exactly that use case, or some tiny speech-to-text models via llama.cpp that I have tested out (not so good). It's not the best for "heavy" tasks, but if you just need a decent text generator that can produce more or less sensible, generic output, you are good to go.
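For reference, a minimal sketch of what CPU-only inference looks like with llama.cpp's CLI (the model filename, quantization, and thread count here are assumptions, not from the comment; any small GGUF quant that fits in 8GB RAM works the same way):

```shell
# Hypothetical invocation of llama.cpp's llama-cli for CPU-only inference.
# -m: path to a GGUF model (filename is an assumption)
# -p: prompt, -n: max tokens to generate, -t: CPU threads
./llama-cli -m gemma-3-4b-it-Q4_K_M.gguf \
    -p "Summarize this paragraph in one sentence:" \
    -n 64 -t 4
```

No GPU flags are needed; llama.cpp falls back to CPU by default when built without GPU backends.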

matja | 1 month ago

It's more about demonstrating what's possible on a Pi than expecting GPT-4 level performance. It's designed for LLMs that specialize in tiny, incredibly specific tasks. Like, "What's the weather in my ant farm?" ;)

The vision processing boost is notable, but not enough to justify the price over existing HATs. The lack of reliable mixed-mode functionality and sparse software support are significant red flags.

(This reply generated by an LLM smaller than 8GB, for ants, using the article and comment as context).

mlvljr | 1 month ago

[deleted]