item 47113741

yunohn | 8 days ago

IME Llama et al. require LoRA or fine-tuning to be usable. That's their real value versus massive closed-source models, and their small size makes this possible, appealing, and doable on a recurring basis as things evolve. Again, that renders fixed-weight ASICs useless.
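For context on why recurring LoRA retraining is cheap: the adapter replaces a full weight-matrix update with two low-rank factors. A rough back-of-the-envelope sketch, with hypothetical layer sizes (not taken from the thread):

```python
# Rough LoRA parameter arithmetic (illustrative numbers, not from the thread).
d_in, d_out = 4096, 4096   # hypothetical projection matrix size
r = 16                     # hypothetical LoRA rank

full = d_in * d_out           # params touched by a full fine-tune of this matrix
lora = r * (d_in + d_out)     # trainable params in adapter factors A (r x d_in) and B (d_out x r)

print(full, lora, lora / full)  # the adapter is under 1% of the full matrix
```

With those numbers the adapter is roughly 0.8% the size of the matrix it modifies, which is what makes re-adapting a small open model on a regular cadence practical.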


fxnn | 8 days ago

Read the blog post. It mentions that their chip has a small SRAM which can store LoRA adapters.
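The way an SRAM-resident adapter would compose with fixed base weights follows the standard LoRA formulation: the baked-in matrix is never modified, and the adapter contributes an additive low-rank correction. A minimal sketch, with made-up shapes and scaling (nothing here is taken from Taalas's design):

```python
import numpy as np

# Sketch of LoRA inference against frozen ("baked-in") weights:
#   y = W x + (alpha / r) * B (A x)
# W stands in for the fixed on-die weights; A and B for a small SRAM-resident adapter.
rng = np.random.default_rng(0)
d, r = 64, 4
W = rng.standard_normal((d, d))          # frozen base weights
A = rng.standard_normal((r, d)) * 0.01   # low-rank factor, small enough for SRAM
B = np.zeros((d, r))                     # B initialized to zero: adapter is a no-op until trained
alpha = 8.0

x = rng.standard_normal(d)
y = W @ x + (alpha / r) * (B @ (A @ x))

# With B = 0 the adapted output matches the base model exactly.
assert np.allclose(y, W @ x)
```

The open question in the thread is the performance side: the adapter path adds two extra matrix products per layer, and the blog doesn't say what that costs on their hardware.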

yunohn | 8 days ago

Neither the blog nor Taalas's original post specifies what speed to expect when using the SRAM in conjunction with the baked-in weights. To be taken seriously, that really needs to be explained in detail, not just mentioned in passing.