item 36430855 (no title)

julien040 | 2 years ago

It's pure speculation, but article embeddings are computed using 512 tokens, which is roughly equivalent to 400 words. I think that using only one word does not allow the model to fully understand the context.

No comments yet.
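The 512-tokens-to-400-words figure the comment cites can be sketched with a rough rule of thumb of ~1.3 tokens per English word (a common approximation for BPE-style tokenizers; the exact ratio, and the `TOKENS_PER_WORD` constant below, are assumptions, not from the source):

```python
# Rough illustration of the 512-token embedding budget mentioned above.
# TOKENS_PER_WORD is a hypothetical average; real tokenizers vary by text.

TOKENS_PER_WORD = 1.3
MAX_TOKENS = 512

def word_budget(max_tokens: int = MAX_TOKENS) -> int:
    """Approximate number of words that fit in the token budget."""
    return int(max_tokens / TOKENS_PER_WORD)

def truncate_to_budget(text: str, max_tokens: int = MAX_TOKENS) -> str:
    """Keep only the leading words likely to fit within max_tokens."""
    words = text.split()
    return " ".join(words[: word_budget(max_tokens)])

print(word_budget())  # 393, close to the ~400 words the comment cites
```

A single word therefore uses well under 1% of the context the embedding model was given per article, which is the gap the comment is pointing at.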