top | item 42423433


DerSaidin | 1 year ago

> Unlike tokenization, BLT has no fixed vocabulary for patches.

If I understand correctly, this means the vocabulary of patches is not known prior to training.

I'd guess that once training has established a vocabulary of patches, that same fixed vocabulary is then used at inference time (if that's not true, I don't see how it could work).

Right?
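My current reading is slightly different: the patches aren't drawn from any vocabulary, learned or otherwise. Boundaries are computed on the fly by a small byte-level model, with a new patch starting wherever next-byte entropy spikes. Here's a toy sketch of that idea; the running bigram table is a stand-in of my own invention (the real system uses a trained autoregressive byte LM), and `threshold` is an arbitrary illustrative value:

```python
import math
from collections import Counter, defaultdict

def segment_into_patches(data: bytes, threshold: float = 1.5) -> list[bytes]:
    """Split a byte stream into variable-length patches.

    A new patch starts wherever the next-byte entropy under a running
    bigram model exceeds `threshold`. This is a toy stand-in for an
    entropy model; no patch vocabulary is ever consulted.
    """
    bigram = defaultdict(Counter)  # previous byte -> next-byte counts
    patches, current = [], bytearray()
    prev = None
    for b in data:
        if prev is not None:
            counts = bigram[prev]
            total = sum(counts.values())
            if total > 0:
                # Shannon entropy (bits) of the next-byte distribution
                ent = -sum((c / total) * math.log2(c / total)
                           for c in counts.values())
            else:
                ent = 8.0  # unseen context: treat as maximally surprising
            if ent > threshold and current:
                patches.append(bytes(current))  # close the current patch
                current = bytearray()
            counts[b] += 1  # update the model as we go
        current.append(b)
        prev = b
    if current:
        patches.append(bytes(current))
    return patches
```

On this view there is nothing to "fix" after training: inference just reruns the same segmentation procedure on new bytes, so previously unseen patches are fine.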
