item 46447355

bfeynman | 2 months ago

Lots of highfalutin language trying to make something that's pretty hand-wavy look like it's not. Where are the benchmarks? The "vector algebra" framing with @X + @Y - @Z is a falsehood. Embedding spaces don't form any meaningful algebraic structure (ring, field, etc.) over semantic concepts; you're just getting lucky via residual effects.
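For readers unfamiliar with the composition being debated, here is a minimal numpy sketch of what an @X + @Y - @Z style query does mechanically: add and subtract unit vectors, then rank candidates by cosine similarity. The 4-d vectors below are toy stand-ins, not real model embeddings, so this only illustrates the mechanics, not whether the technique actually works:

```python
import numpy as np

def normalize(v):
    # scale to unit L2 norm so cosine similarity is just a dot product
    return v / np.linalg.norm(v)

# Toy 4-d vectors standing in for real embeddings (illustrative only).
vocab = {
    "king":  normalize(np.array([0.9, 0.8, 0.1, 0.0])),
    "man":   normalize(np.array([0.1, 0.9, 0.0, 0.0])),
    "woman": normalize(np.array([0.1, 0.0, 0.9, 0.1])),
    "queen": normalize(np.array([0.9, 0.0, 0.9, 0.1])),
}

# The @X + @Y - @Z style composition: king - man + woman.
composed = normalize(vocab["king"] - vocab["man"] + vocab["woman"])

# Nearest neighbour by cosine similarity over the toy vocabulary.
best = max(vocab, key=lambda w: float(vocab[w] @ composed))
print(best)  # → queen (in this contrived toy space)
```

Whether this generalizes beyond contrived examples is exactly the benchmarking question raised above.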

Xyra | 2 months ago

I'm giving you, the user, what's most likely the easiest way you've ever had to explore embedding space yourself. Embeddings are tricky and can mislead, but they often compose surprisingly intuitively, especially once you've played with them and built up some intuition.

edmundsauto | 2 months ago

What is the impact of misleading embeddings, and how do they compose? I'm honestly interested but don't know enough to understand what you're saying.

Why would I want to explore the embedding space myself? Isn't this a tool where I can run cross-data exploratory analyses against unstructured data, with the content pre-populated?

Xyra | 1 month ago

We can iterate fast as we build an understanding of useful paradigms of vector manipulation. Yesterday I added `debias_vector(axis, topic)` and guidance on L2 normalization.
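Xyra's actual `debias_vector` implementation isn't shown anywhere in the thread; one common way such a helper works is orthogonal projection: remove the component of a topic vector that lies along a bias axis, then re-normalize. The sketch below assumes that interpretation (the signature is taken from the comment; the body and the example vectors are guesses):

```python
import numpy as np

def l2_normalize(v, eps=1e-12):
    # guard against division by zero for (near-)null vectors
    return v / max(np.linalg.norm(v), eps)

def debias_vector(axis, topic):
    # Assumed behaviour: subtract the projection of `topic` onto `axis`,
    # keeping only the component orthogonal to the bias direction.
    axis = l2_normalize(axis)
    debiased = topic - (topic @ axis) * axis
    return l2_normalize(debiased)

axis = np.array([1.0, 0.0, 0.0])   # hypothetical bias direction
topic = np.array([0.5, 0.5, 0.0])  # hypothetical topic embedding
out = debias_vector(axis, topic)
print(out @ axis)  # component along the bias axis is now 0
```

The final `l2_normalize` matches the L2-normalization guidance mentioned above: after projection the vector is shorter, so it is rescaled back to the unit sphere before any cosine comparisons.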

bfeynman | 1 month ago

The manifold structure of embedding spaces isn't semantically uniform. You've found a nice little novelty, but it's not rigorous, and you're using AI slop to name this "vector algebra" instead of finding or running a benchmark to show that it actually works better.