(no title)
keeeba | 2 months ago
Larger, more capable embedding models are better able to separate the different uses of a given word in the embedding space; smaller models are not.
Xyra | 2 months ago
When Claude uses our embed endpoint to embed arbitrary text as a search vector, it should work fairly well across domains. One can also use compositions of centroids (averages) of vectors in our database as search vectors.
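A minimal sketch of the centroid idea, assuming unit-normalized embedding vectors and cosine similarity; the random vectors here stand in for real embeddings from an embed endpoint:

```python
import numpy as np

# Stand-in for a database of embeddings; rows are unit-normalized vectors.
rng = np.random.default_rng(0)
db = rng.normal(size=(100, 8))
db /= np.linalg.norm(db, axis=1, keepdims=True)

# Composite search vector: the centroid (average) of a few related
# vectors, renormalized back onto the unit sphere.
centroid = db[[3, 7, 12]].mean(axis=0)
centroid /= np.linalg.norm(centroid)

# Cosine-similarity search: with unit vectors, the dot product is the
# cosine similarity, so the top hits are the largest dot products.
scores = db @ centroid
top5 = np.argsort(scores)[::-1][:5]
```

The vectors that went into the centroid will, on average, score well above unrelated vectors, which is why an averaged vector can act as a search query for the shared concept.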
A4ET8a8uTh0_v2 | 2 months ago