item 37440196

euclaise | 2 years ago

That tweet had it backwards: a larger tokenizer vocabulary means each token covers more text on average, so a 16k-token context window typically fits even longer passages than LLaMA's would at 16k.
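A toy sketch of the effect, using character-level versus word-level splitting as stand-ins for a small versus large vocabulary (not LLaMA's actual tokenizers):

```python
# Toy illustration: a tokenizer with a larger vocabulary emits fewer
# tokens for the same text, so a fixed 16k-token window covers a
# longer passage of raw text.
text = "the quick brown fox jumps over the lazy dog " * 100

# Small vocabulary: character-level tokenization (many tokens per word).
char_tokens = list(text)

# Larger vocabulary: word-level tokenization (one token per word).
word_tokens = text.split()

print(len(char_tokens))  # 4400 tokens
print(len(word_tokens))  # 900 tokens

# More characters of text per token means a 16k-token window
# holds a longer passage.
chars_per_token_small = len(text) / len(char_tokens)
chars_per_token_large = len(text) / len(word_tokens)
print(chars_per_token_large > chars_per_token_small)  # True
```

The same comparison with real BPE tokenizers (e.g. via Hugging Face's `AutoTokenizer`) shows the same direction: a bigger merged vocabulary compresses typical text into fewer tokens.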
