
thatguysaguy | 4 months ago

Back when BERT came out, everyone was trying to get it to generate text. These attempts generally didn't work; here's one for reference, though: https://arxiv.org/abs/1902.04094

This doesn't have an explicit diffusion tie-in, but Savinov et al. at DeepMind figured out that doing two steps at training time and randomizing the masking probability is enough to get it to work reasonably well.
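The shape of that recipe can be sketched in a few lines: sample a masking probability per example instead of BERT's fixed rate, corrupt the sequence, denoise once, then denoise the model's own output a second time. Everything here is a toy stand-in (`toy_model` is not Savinov et al.'s actual architecture, and a real loss against the clean targets is omitted); it only illustrates the two-step unroll with randomized masking.

```python
import random

MASK = "[MASK]"

def toy_model(tokens):
    # Hypothetical stand-in for a BERT-style denoiser: it just fills
    # every masked position with a fixed token. A real model would
    # predict a distribution per position.
    return [t if t != MASK else "the" for t in tokens]

def corrupt(tokens, mask_prob, rng):
    # Randomize the masking probability per example (uniform in [0, 1])
    # rather than BERT's fixed ~15%.
    return [MASK if rng.random() < mask_prob else t for t in tokens]

def two_step_unroll(tokens, rng):
    mask_prob = rng.random()         # sampled fresh for each example
    corrupted = corrupt(tokens, mask_prob, rng)
    step1 = toy_model(corrupted)     # first denoising step
    step2 = toy_model(step1)         # second step, on the model's own output
    # In training, a cross-entropy loss against `tokens` would be
    # applied at both steps; here we just return the reconstructions.
    return step1, step2
```

The point of the second step is that the model learns to refine its own imperfect samples, which is what makes iterative (diffusion-like) decoding work at inference time.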


thatjoeoverthr | 4 months ago

I'm just learning this from your text, after spending the last week trying to get a BERT model to talk.

https://joecooper.me/blog/crosstalk/

I’ve still got a few ideas to try though so I’m not done having fun with it.

Anon84 | 4 months ago

The trick is to always put the [MASK] at the end:

"The [MASK]" "The quick [MASK]" etc

binarymax | 4 months ago

Interesting, as I was in the (very large) camp that never considered it for generation and saw it as a pure encoder for things like semantic similarity, with an easy jump to classification, etc.