asey
|
3 years ago
Models like these don't see words as made up of letters; they see whole words (tokens) as single units. As a result, they're not very good at producing novel (non-memorized) anagrams, palindromes, and the like.
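To make that concrete, here's a minimal sketch with a toy, hypothetical vocabulary (the words and ids are made up; real tokenizers like BPE also fall back to sub-word pieces for unknown strings):

```python
# Hypothetical whole-word vocabulary, just to illustrate the idea.
vocab = {"stressed": 4821, "desserts": 1305}

def tokenize(word):
    # A real BPE tokenizer would split unknown words into sub-word
    # pieces; here we only show the whole-word case.
    return [vocab[word]]

# "desserts" is "stressed" reversed, but the model just sees two
# unrelated integer ids -- the shared letters are invisible to it.
print(tokenize("stressed"))  # [4821]
print(tokenize("desserts"))  # [1305]
```

Since the model never receives the character sequence, reversing or rearranging letters is something it can only do if it memorized the pair during training.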