item 44879239

Mindless2112 | 6 months ago

Tokenization is an inherent weakness of current LLM design, so it makes sense to compensate for it. Hopefully some day tokenization will no longer be necessary.
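As a toy illustration of the weakness the comment points at (the vocabulary and greedy longest-match tokenizer below are hypothetical, not any real model's): subword tokenization hands the model multi-character units, so character-level structure is invisible to it.

```python
# Toy greedy longest-match tokenizer, a simplification of BPE-style
# subword tokenization. The vocabulary here is made up for illustration.
VOCAB = {"straw", "berry", "s", "t", "r", "a", "w", "b", "e", "y"}

def tokenize(text: str) -> list[str]:
    tokens = []
    i = 0
    while i < len(text):
        # Take the longest vocabulary entry that matches at position i.
        for j in range(len(text), i, -1):
            if text[i:j] in VOCAB:
                tokens.append(text[i:j])
                i = j
                break
        else:
            raise ValueError(f"no token covers {text[i]!r}")
    return tokens

print(tokenize("strawberry"))  # ['straw', 'berry']
# The model receives two opaque units, not ten letters, which is why
# character-level tasks (counting the r's, spelling backwards) are hard.
```

Character- or byte-level models avoid this by construction, at the cost of much longer sequences, which is one reason tokenization persists for now.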

No comments yet.