item 45550877

diyer22 | 4 months ago

It does seem that way — we’re both trying to overcome the limitations imposed by LLM tokenization to achieve a truly end-to-end model.

And their work is far more polished; I’ve only put together a quick GPT+DDN proof of concept.

Thank you for sharing.
