top | item 39383836

ranulo | 2 years ago

> This new generation also delivers a breakthrough in long-context understanding. We’ve been able to significantly increase the amount of information our models can process — running up to 1 million tokens consistently, achieving the longest context window of any large-scale foundation model yet.

Sweet, this opens up so many possibilities.
