top | item 34008605

bryanh | 3 years ago

https://twitter.com/sherwinwu/status/1603522777968832512 appears to include it.

minimaxir | 3 years ago

So it does.

The code there implies cl100k_base has a vocab size of 100k (I guess it's in the name lol), which makes it more comprehensive than GPT-2's 50k vocab, so fewer tokens will be needed for the same text.
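To see why a bigger vocabulary means fewer tokens, here's a toy greedy longest-match tokenizer (purely illustrative; real BPE tokenizers like cl100k_base use learned merge rules, and the vocabularies below are made up):

```python
def tokenize(text, vocab):
    """Greedily match the longest vocabulary entry at each position.

    Any single character counts as a fallback token, mimicking how
    byte-level BPE can always fall back to individual bytes.
    """
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible match first, down to a single char.
        for length in range(min(len(text) - i, 8), 0, -1):
            piece = text[i:i + length]
            if piece in vocab or length == 1:
                tokens.append(piece)
                i += length
                break
    return tokens

small_vocab = {"to", "ke", "en", "iz"}          # short pieces only
large_vocab = small_vocab | {"token", "izer"}   # adds longer pieces

small = tokenize("tokenizer", small_vocab)
large = tokenize("tokenizer", large_vocab)
print(small)  # ['to', 'ke', 'n', 'iz', 'e', 'r'] -- 6 tokens
print(large)  # ['token', 'izer'] -- 2 tokens
```

Same string, but the larger vocabulary covers longer pieces, so the token count drops, which is exactly the effect of going from a 50k to a 100k vocab.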