(no title)
maiybe | 6 years ago
One area I'd push back on is that "this is not the fault of Tensorflow." An area of weakness for Tensorflow is that it solves a number of DL problems with a specialized API call. That's not an asset, that's a liability.
LSTMs were always a pain point. So much so that for Tensorflow projects, I gave up and insisted on traditional feedforward approaches like CNNs + MLPs or ResNets even when LSTMs would have been viable. Mostly identical performance with decent speed boosts from avoiding recurrence, and the simpler code reduced the maintenance burden for non-ML engineers.
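A minimal sketch of the swap being described, assuming the Keras API and made-up shapes (64 timesteps, 16 features): a recurrent model next to a feedforward Conv1D + MLP replacement that consumes the same sequence input.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Recurrent baseline: LSTM over a (timesteps, features) sequence.
lstm_model = models.Sequential([
    layers.Input(shape=(64, 16)),
    layers.LSTM(32),
    layers.Dense(1),
])

# Feedforward alternative: 1D conv + pooling + MLP, no recurrence,
# so every timestep is processed in parallel.
conv_model = models.Sequential([
    layers.Input(shape=(64, 16)),
    layers.Conv1D(32, kernel_size=3, activation="relu"),
    layers.GlobalAveragePooling1D(),
    layers.Dense(32, activation="relu"),
    layers.Dense(1),
])

x = np.random.rand(8, 64, 16).astype("float32")  # batch of 8 sequences
print(lstm_model(x).shape, conv_model(x).shape)  # both produce (8, 1)
```

Both models map the same input to the same output shape, which is what makes the drop-in replacement (and the resulting speed comparison) possible.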
As soon as you branch out of the standard bread-and-butter DL models, you spend frustratingly long periods of time tracking down obscure solutions in a part of the API space that has its own hard-to-follow logic.
Every time I'd point out that it's hard to do something, in forums or directly on HN, I'd get a response that it's easy to do with [insert-random-api] function call.
In the end, it's my opinion that Tensorflow will lose out to JAX and Pytorch, through no fault but its own complicated construction.
sseveran | 6 years ago
Personally I think Tensorflow has already lost, and we just need to let it play out over the next few years. One interesting wrinkle is that since Trax, JAX, and Flax utilize pieces of Tensorflow, the TF team can probably claim good internal adoption numbers depending on how they count.
lowdose | 6 years ago