I don't have an answer for this really, outside of silly ones like "strict equality check", but I assert that no one else does either, at least today and right now. It's an inherent limitation due to the nature of embeddings and the space they aim to occupy (cheap, fast, good-enough similarity for your use case).
scotty79|2 years ago
Mean squared error instead of dot product; it's not cheaper, but it's close.
If you want to go cheaper, you could use the sum of absolute differences.
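A minimal sketch of the three measures mentioned (dot product, mean squared error, sum of absolute differences), with made-up example vectors just for illustration:

```python
import numpy as np

# Hypothetical 4-dimensional "embeddings"; real ones are just longer.
a = np.array([0.1, 0.8, -0.3, 0.5])
b = np.array([0.2, 0.7, -0.1, 0.4])

dot = np.dot(a, b)            # similarity: higher means closer
mse = np.mean((a - b) ** 2)   # distance: lower means closer
sad = np.sum(np.abs(a - b))   # L1 distance: no multiplications needed

print(dot, mse, sad)
```

Note the polarity flip: dot product is a similarity (bigger is better), while MSE and sum of absolute differences are distances (smaller is better), so you can't swap one for the other in ranking code without inverting the sort order.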
soarerz|1 year ago
For a lot of the embeddings we have today, the norm of any embedding vector is roughly the same, so the angle between two vectors tracks the length of the difference you describe, and it can be expressed in terms of 1 - dot product after scaling.
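The identity behind this, for unit-norm vectors, is ||a - b||^2 = 2(1 - a·b), so squared distance and 1 - dot product differ only by a constant factor. A quick numeric check (random vectors and the dimension 128 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two random vectors normalized to unit length, mimicking embeddings
# whose norms are all roughly the same size.
a = rng.normal(size=128)
a /= np.linalg.norm(a)
b = rng.normal(size=128)
b /= np.linalg.norm(b)

# ||a - b||^2 = ||a||^2 + ||b||^2 - 2(a.b) = 2 - 2(a.b) for unit vectors
sq_dist = np.sum((a - b) ** 2)
from_dot = 2 * (1 - np.dot(a, b))

print(sq_dist, from_dot)  # equal up to floating-point error
```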
latency-guy2|2 years ago
You're probably best off using the commercial suggestion, and if it's dot product, go for it. I am no expert in this area, and my interest wanes every day.
unknown|2 years ago
[deleted]
gajus|2 years ago