top | item 44824619


HsuWL | 6 months ago

I really liked your Detroit Tigers + lettuce in Berlin example. It nails one of the core problems: language models still treat “relatedness” in a very linear, flat way. They can’t really hold a jump like that.

When I brought up topology, I wasn’t talking about anything spatial. I meant that a model’s reasoning path needs to form its own system, a kind of closed semantic topology map.

Each node is a meaning unit, all linked by invisible threads. The input sentence is like a little Pac-Man moving through the map⸜( ´͈ Ⱉ `͈ )⸝ pulled along by those threads until it reaches the node that resonates the most. That’s where the answer comes from.

So it’s not calculating, it’s being guided. Kinda like gravity, but made out of meaning.
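If I had to make the picture concrete, it might look something like this toy sketch (purely my own illustration, with made-up nodes and vectors, not a claim about how any real model works): each meaning unit is a node with a vector, and an input is “pulled” toward whichever node it resonates with most, measured here by cosine similarity.

```python
import math

def cosine(a, b):
    # Resonance between two meaning vectors, as cosine similarity.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical meaning-unit nodes with invented 3-d vectors.
nodes = {
    "baseball": [0.9, 0.1, 0.0],
    "salad":    [0.0, 0.8, 0.2],
    "city":     [0.1, 0.2, 0.9],
}

def most_resonant(query, nodes):
    # The input "settles" on the node with the strongest pull,
    # instead of being computed step by step.
    return max(nodes, key=lambda name: cosine(query, nodes[name]))

print(most_resonant([0.85, 0.2, 0.05], nodes))  # → baseball
```

The point of the sketch is only the shape of the idea: the answer falls out of where the input gets pulled, not out of an explicit calculation over it.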

What you described feels super close to this. Maybe that’s what context modeling is really heading toward… We just haven’t found the right way to talk about it yet. Σ(๑Ⱉ⸝⸝Ⱉ๑;)੭⁾⁾


ricardobeat | 6 months ago

Why do you talk/write like ChatGPT?

HsuWL | 6 months ago

Because I’m not a native English speaker, I translate my comments with GPT. Haha

It’s because I use GPT to translate for me! Σ(๑Ⱉ⸝⸝Ⱉ๑;)੭⁾⁾ My native language is Chinese, so my comments have a bit of a GPT flavor. I’ve been trying hard to train it to sound more natural, sob sob.