top | item 45353139


matheist | 5 months ago

Oh now I understand. I thought your ab and ba were single tokens (even though that doesn't make sense in context). Once you point out they're separate tokens, I follow you. Thank you!

Edit: that's a great example

Edit 2: even more fun: training data is [ab, ab, ba, bb, bb, bb]. Then constrained sampling flips your likelihood from 1:2 to 2:1
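The flip is easy to check numerically. Here's a minimal sketch (function names are mine, not from the thread), assuming "constrained sampling" means masking tokens that can't extend to a valid string at each step and renormalizing, with the constraint set {ab, ba}:

```python
from collections import Counter
from fractions import Fraction

# Toy corpus from the comment: six two-token strings over tokens {a, b}.
corpus = ["ab", "ab", "ba", "bb", "bb", "bb"]

first = Counter(s[0] for s in corpus)  # counts of first tokens
pairs = Counter(corpus)                # counts of full strings
n = len(corpus)

def p_unconstrained(s):
    # Empirical probability of the full string under the corpus model.
    return Fraction(pairs[s], n)

def p_constrained(s, grammar):
    # Token-by-token constrained decoding: at each step, mask tokens that
    # cannot extend to any string in the grammar, then renormalize.
    valid_firsts = {g[0] for g in grammar}
    z1 = sum(first[t] for t in valid_firsts)   # renormalize first-token probs
    p = Fraction(first[s[0]], z1)
    conts = {g[1] for g in grammar if g[0] == s[0]}
    z2 = sum(pairs[s[0] + t] for t in conts)   # renormalize continuations
    p *= Fraction(pairs[s], z2)
    return p

grammar = {"ab", "ba"}
print(p_unconstrained("ab"), p_unconstrained("ba"))          # 1/3 1/6
print(p_constrained("ab", grammar), p_constrained("ba", grammar))  # 1/3 2/3
```

Unconstrained, ab is twice as likely as ba (2:1); under per-token constrained decoding, b is the more common first token (4 of 6 strings), and after masking it is forced to continue with a, so the ratio comes out 1:2 the other way.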

hansvm | 5 months ago

Thanks :) My example is minimal, which is nice since I wind up re-deriving it in a hurry every time I need it. I do like the 1:2 to 2:1 symmetry though. Very elegant.