
How to run LLMs locally on mobile devices (with Gemma and On-Device AI tools)

1 point | annjose | 8 months ago | annjose.com

1 comment


incomingpain | 8 months ago

Any model that can run on a mobile device will likely be 8B parameters or smaller, and will have very noticeable hallucination problems.
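The 8B ceiling mentioned above follows from phone memory budgets: a sketch of the back-of-envelope arithmetic, assuming weights dominate memory use and ignoring KV cache and runtime overhead. The function name and the 4-bit quantization figure are illustrative assumptions, not from the article.

```python
def quantized_model_size_gb(n_params: float, bits_per_weight: int) -> float:
    """Approximate weight-only memory footprint in GiB.

    Ignores KV cache, activations, and runtime overhead, so the
    real requirement is somewhat higher.
    """
    return n_params * bits_per_weight / 8 / 1024**3

# An 8B-parameter model quantized to 4 bits per weight:
print(round(quantized_model_size_gb(8e9, 4), 2))  # → 3.73 (GiB)
```

Roughly 4 GiB for the weights alone already strains a typical 6-12 GB phone whose RAM is shared with the OS and other apps, which is why on-device models tend to be 8B or (usually much) smaller.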