item 46684777

aziis98 | 1 month ago

I hope we get some good A1B models, as I'm currently GPU poor and can only do inference on CPU for now.

yowlingcat | 1 month ago

It may be worth taking a look at LFM [1]. I haven't had a need for it myself (I run on Apple silicon day to day, so my dailies are usually the 30B+ MoEs), but I've heard good things from folks using it as a daily driver on their phones. YMMV.

[1] https://huggingface.co/LiquidAI/LFM2.5-1.2B-Instruct