fooofw | 2 months ago

The tokenizer can represent uncommon words with multiple tokens. Inputting your example on https://platform.openai.com/tokenizer (GPT-4o) gives me (tokens separated by "|"): lower|case|un|se|parated|name
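As a rough illustration of how a long unseen word ends up as several known subword tokens, here is a toy greedy longest-prefix tokenizer. This is only a sketch: real BPE tokenizers like the GPT-4o one use learned merge rules over a vocabulary of ~200k tokens, not a hand-picked vocabulary with greedy matching, and the `VOCAB` set below is chosen just to reproduce the split from the comment.

```python
# Toy subword tokenizer sketch. NOT the real BPE algorithm: actual tokenizers
# (e.g. OpenAI's tiktoken) apply learned byte-pair merge rules; this vocabulary
# is hand-picked purely to illustrate the splitting behavior.
VOCAB = {"lower", "case", "un", "se", "parated", "name"}

def tokenize(word, vocab):
    """Split `word` by repeatedly taking the longest prefix found in `vocab`."""
    tokens = []
    i = 0
    while i < len(word):
        # Try the longest remaining prefix first, shrinking until a match.
        for j in range(len(word), i, -1):
            if word[i:j] in vocab:
                tokens.append(word[i:j])
                i = j
                break
        else:
            # No vocab entry matches: fall back to a single character,
            # loosely analogous to byte-level fallback in real tokenizers.
            tokens.append(word[i])
            i += 1
    return tokens

print("|".join(tokenize("lowercaseunseparatedname", VOCAB)))
# → lower|case|un|se|parated|name
```

The key point the comment makes survives even in this toy version: the model never needs the whole word in its vocabulary, because any string can be decomposed into smaller known pieces.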