
timtom123 | 1 year ago

I'm excited to test the final model. This could be a major breakthrough for open-source LLMs.

arilotter | 1 year ago

This specific model is trained on only 100 billion tokens, so it's not SOTA by any means, but we've got designs on larger training runs later :)