top | item 19506608

slashcom | 7 years ago

And replaced by residual connections in transformers, which are absolutely dominating LSTMs now.
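The residual connections mentioned here wrap each transformer sub-layer so the sub-layer's output is added back to its input, giving gradients an identity path through depth instead of through a recurrent state. A minimal NumPy sketch of one such block (all names, dimensions, and weight shapes are illustrative assumptions, not any particular implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    # Scaled dot-product self-attention over the whole sequence at once;
    # unlike an LSTM, no state is carried position-to-position.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

def residual_block(x, w_q, w_k, w_v):
    # The residual connection: add the sub-layer output to its input.
    # If the sub-layer contributes nothing, the block is the identity.
    return x + self_attention(x, w_q, w_k, w_v)

rng = np.random.default_rng(0)
seq_len, d = 4, 8
x = rng.normal(size=(seq_len, d))
w_q, w_k, w_v = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
y = residual_block(x, w_q, w_k, w_v)
print(y.shape)  # (4, 8)
```

Note the output keeps the input's shape, which is what lets these blocks be stacked dozens of layers deep with stable gradient flow.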

stochastic_monk | 7 years ago

Transformer-XL uses recurrence, and much of the NLP SOTA is still held by LSTMs. I’m not sure I’d expect attention mechanisms to fully replace recurrence.