top | item 41140549


alexgarcia-xyz | 1 year ago

You can generate "text embeddings" (i.e. vectors) from your text with an embedding model. Once you have your text represented as vectors, "closest vector" means "most semantically similar", as defined by the embedding model you use. This lets you build semantic search engines, recommendations, and classifiers on top of your text and embeddings. It's kind of like a fancier, fuzzier keyword search.

I wouldn't completely replace keyword search with vector search, but it's a great addition that essentially lets you perform calculations on text.
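A minimal sketch of the "closest vector = most similar" idea, in plain Python. The tiny 3-dimensional vectors here are hand-made stand-ins for real embeddings, which would come from an embedding model and have hundreds or thousands of dimensions; the similarity and search logic is the same either way:

```python
import math

# Toy "embeddings": hypothetical hand-made vectors standing in for
# what an embedding model would produce for each sentence.
docs = {
    "the cat sat on the mat": [0.9, 0.1, 0.0],
    "a kitten rests on a rug": [0.8, 0.2, 0.1],
    "quarterly revenue grew 5%": [0.0, 0.1, 0.9],
}

def cosine_similarity(a, b):
    # Cosine similarity: dot product divided by the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, corpus):
    # "Closest vector" here = highest cosine similarity to the query.
    return max(corpus, key=lambda text: cosine_similarity(query_vec, corpus[text]))

# Pretend embedding of the query "a cat lying on a carpet".
query = [0.85, 0.15, 0.05]
print(semantic_search(query, docs))  # finds a cat sentence, not the revenue one
```

In a real system you'd replace the hand-made vectors with output from an embedding model, and swap the linear scan in `semantic_search` for a vector index once the corpus gets large.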


yAak | 1 year ago

Thank you! This was an extremely helpful explanation for me, as someone not familiar with the topic. =)

bodantogat | 1 year ago

Nice explanation. One use case where keywords haven't worked well for me, and where (at least at first glance) vectors are doing better, is longer passages -- finding sentences that are similar rather than just words.