
rsfern | 8 months ago

The paper "Were RNNs All We Needed?" explores this hypothesis a bit, finding that some pre-transformer sequence models can match transformers when trained at appropriate scale. The authors did have to make some modifications to the recurrences to unlock more parallelism, though.

https://arxiv.org/abs/2410.01201
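To sketch the kind of modification involved: the paper's minimal GRU variant makes the gate and candidate state depend only on the current input, not the previous hidden state. The recurrence then becomes linear in h (h_t = a_t * h_{t-1} + b_t), which can be solved with a parallel prefix scan instead of a sequential loop. Below is a toy NumPy illustration of that idea (scalar states, made-up weights wz/wh, and a naive cumprod/cumsum closed form rather than the numerically stabler log-space scan the paper actually uses):

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def mingru_sequential(x, wz, wh, h0):
    """Minimal-GRU-style recurrence, computed step by step (O(T) serial).
    Key change vs. a classic GRU: z_t and h~_t depend only on x_t,
    never on h_{t-1}."""
    z = sigmoid(x * wz)        # update gate, from input only
    h_tilde = x * wh           # candidate state, from input only
    h, out = h0, []
    for t in range(len(x)):
        h = (1 - z[t]) * h + z[t] * h_tilde[t]
        out.append(h)
    return np.array(out)

def mingru_parallel(x, wz, wh, h0):
    """Same recurrence as a linear scan h_t = a_t * h_{t-1} + b_t,
    with a_t = 1 - z_t and b_t = z_t * h~_t. Solved in closed form:
    h_t = A_t * (h0 + sum_{i<=t} b_i / A_i), where A_t = prod_{i<=t} a_i.
    (Naive version; can underflow for long sequences.)"""
    z = sigmoid(x * wz)
    h_tilde = x * wh
    a, b = 1 - z, z * h_tilde
    A = np.cumprod(a)
    return A * (h0 + np.cumsum(b / A))
```

Both functions compute the same hidden-state trajectory; the point is that once the recurrence is linear in h, the whole sequence can be computed with cumulative (scan) primitives rather than a timestep loop.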
