Quite a lot of techniques in deep learning have stood the test of time at this point. New techniques are also developed either by building on old ones or by trying to solve their deficiencies. For example, Transformers were developed to address the difficulty LSTMs have carrying gradients across very long sequences and to improve GPU utilization, since LSTM computation is inherently sequential in the time dimension.
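To make the sequential-vs-parallel point concrete, here is a minimal NumPy sketch (toy shapes, a simplified recurrence rather than full LSTM gating, and single-head attention without learned projections — all assumptions for illustration). The recurrent update must loop over time because each hidden state depends on the previous one, while self-attention computes all positions in one batched matrix multiply:

```python
import numpy as np

rng = np.random.default_rng(0)
T, d = 6, 4                          # toy sequence length and hidden size
x = rng.standard_normal((T, d))

# Recurrent (LSTM-style) path: each step needs the previous hidden state,
# so the T steps cannot run in parallel. (Simplified update, no gates.)
W, U = rng.standard_normal((d, d)), rng.standard_normal((d, d))
h = np.zeros(d)
hs = []
for t in range(T):                   # inherently sequential loop over time
    h = np.tanh(x[t] @ W + h @ U)
    hs.append(h)
hs = np.stack(hs)                    # (T, d)

# Self-attention path: every position attends to every other position
# in a few dense matmuls, so all T outputs are computed at once.
Q = K = V = x                        # no projections, for illustration
scores = Q @ K.T / np.sqrt(d)        # (T, T) pairwise scores
weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
attn = weights @ V                   # (T, d), parallel over positions

print(hs.shape, attn.shape)
```

The loop on the recurrent side is the bottleneck the comment refers to: its iterations form a dependency chain of length T, whereas the attention matmuls map directly onto parallel GPU hardware.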
ldjkfkdsjnv|2 years ago
opportune|2 years ago
Also, being an "expert in LSTM" is like being an "expert in HTTP/1.1" or "knowing a lot about Java 8". It's not knowledge or a skill that stands on its own. An expert in HTTP/1.1 is probably also very knowledgeable about web serving, networking, or backend development. HTTP/2 being invented doesn't make that knowledge obsolete at all. And knowledge of HTTP/1.1 would certainly come in handy if you were trying to research or design something like a new protocol, just as knowledge of LSTMs could provide a lot of value to those looking for the next breakthrough in stateful models.