rajman187 | 8 months ago

That’s why you have encoders as well as decoders. For example, another model from Meta does this for translation: it has per-language encoders and decoders that all share a single embedding space representing semantic concepts, so a sentence encoded from any language can be decoded into any other.

https://ai.meta.com/research/publications/sonar-sentence-lev...
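To make the idea concrete, here is a toy sketch of that architecture: per-language encoders and decoders routed through one language-neutral space. The concept inventory, lexicons, and list-of-IDs "embedding" are all illustrative stand-ins, not the real SONAR model (which uses learned neural encoders/decoders and dense vectors).

```python
# Toy model of a shared semantic embedding space with per-language
# encoders/decoders. Everything here (CONCEPTS, LEXICONS, the ID-list
# "embedding") is a made-up simplification for illustration.

# Language-neutral concept inventory standing in for the shared space.
CONCEPTS = {"hello": 0, "world": 1}

# Per-language surface forms for each concept.
LEXICONS = {
    "en": {"hello": "hello", "world": "world"},
    "fr": {"hello": "bonjour", "world": "monde"},
}

def encode(sentence: str, lang: str) -> list:
    """Encoder for `lang`: map a sentence into language-neutral concept IDs."""
    surface_to_concept = {v: k for k, v in LEXICONS[lang].items()}
    return [CONCEPTS[surface_to_concept[w]] for w in sentence.split()]

def decode(embedding: list, lang: str) -> str:
    """Decoder for `lang`: map concept IDs back to a sentence in that language."""
    id_to_concept = {v: k for k, v in CONCEPTS.items()}
    return " ".join(LEXICONS[lang][id_to_concept[i]] for i in embedding)

# Translation = encode with one language's encoder, decode with another's;
# the embedding itself carries no information about the source language.
emb = encode("hello world", "en")
print(decode(emb, "fr"))  # bonjour monde
```

The point of the design is that adding a new language means training one new encoder/decoder pair against the shared space, rather than a separate model for every language pair.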
