top | item 40443755


sn0wr8ven | 1 year ago

There definitely are smaller LLMs that can run on consumer computers, but as for their performance, you would be lucky to get a coherent full sentence. On the other hand, sending and receiving responses as text is probably the fastest and most realistic way to implement these models in games.
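To make the "text in, text out" idea concrete, here is a minimal sketch of a game handing an NPC persona and the player's line to a locally hosted model. It assumes an OpenAI-compatible endpoint of the kind llama.cpp's server exposes; the URL, port, and model name are placeholders, not any particular game's API:

```python
import json
import urllib.request

# Assumed local endpoint -- tools like llama.cpp's server expose an
# OpenAI-compatible /v1/chat/completions route, but this exact URL,
# port, and model name are placeholders.
LLM_URL = "http://localhost:8080/v1/chat/completions"

def build_npc_request(npc_persona: str, player_line: str) -> dict:
    """Package an NPC persona and the player's line as a chat request."""
    return {
        "model": "llama-3-8b-instruct",  # assumed local model name
        "messages": [
            {"role": "system", "content": npc_persona},
            {"role": "user", "content": player_line},
        ],
        "max_tokens": 128,  # keep replies short for in-game pacing
    }

def ask_npc(npc_persona: str, player_line: str) -> str:
    """POST the request to the local server and return the reply text."""
    payload = json.dumps(build_npc_request(npc_persona, player_line)).encode()
    req = urllib.request.Request(
        LLM_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Only runs if a local server is actually listening on LLM_URL.
    print(ask_npc("You are a terse blacksmith.", "Can you repair my sword?"))
```

Because the request is plain JSON over HTTP, the game loop only needs to serialize a string and parse one back, which is what makes text the low-friction integration path.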


imtringued | 1 year ago

I've gone past the 8k context window with very good text generation on llama3. I don't know what you're smoking.
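For a sense of scale, an 8k-token window holds roughly 32k characters of dialogue under the common ~4-characters-per-token rule of thumb. A quick sketch (the heuristic is an approximation, not any tokenizer's exact count):

```python
def rough_token_count(text: str, chars_per_token: float = 4.0) -> int:
    """Crude token estimate via the ~4-characters-per-token heuristic."""
    return max(1, round(len(text) / chars_per_token))

def fits_in_context(history: list[str], context_window: int = 8192) -> bool:
    """Check whether accumulated chat turns still fit the model's window."""
    total = sum(rough_token_count(turn) for turn in history)
    return total <= context_window
```

A real integration would use the model's own tokenizer for the count, but even this rough check is enough to decide when to start trimming or summarizing old turns.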