bodono|5 years ago
I'm not sure this is really going to take off; it seems that most people who are abandoning TF are moving to JAX or PyTorch. My own experience with JAX is that it is much easier to use than TF, just an all-round more pleasant experience. It would be interesting to try this, but at this point I'm not really willing to learn 'yet another deep learning framework', and the extreme anti-user problems that TF had make me loath to give it another shot, even with a presumably better frontend. Moreover, I think that Python is just a better all-round ML/data science language at this point. Has anyone tried both JAX and this and would be willing to give us their thoughts on the strengths and weaknesses of each?
gas9S9zw3P9c|5 years ago
PyTorch has already stood the test of time and proven that its development is led by a competent team.
iflp|5 years ago
And now there are already multiple NN libraries for JAX from Google...
BadInformatics|5 years ago
- Dex: https://github.com/google-research/dex-lang/
- Hasktorch: https://github.com/hasktorch/hasktorch
- This initiative from the Python typing-sig: https://docs.google.com/document/d/1oaG0V2ZE5BRDjd9N-Tr1N0IK...
marmaduke|5 years ago
https://futhark-lang.org/blog/2020-03-15-futhark-0.15.1-rele...
and it seems to be OK for DL:
https://elsman.com/pdf/fhpnc19.pdf
sandGorgon|5 years ago
The latest release of TensorFlow Probability uses JAX under the hood. So what do you mean when you say you're moving to JAX versus TensorFlow?
joaogui1|5 years ago
XLA: Accelerated Linear Algebra. I guess it's kind of a backend/compiler that optimizes linear algebra/deep learning computations with some very interesting techniques, among them fusing kernels.
JAX: In some sense syntactic sugar over XLA, but a better way of describing it is composable transformations + NumPy + some SciPy. The composable transformations let you take derivatives (of single-, multi-, or vector-valued functions, including higher-order derivatives), JIT-compile a function (which is then lowered to XLA), and use two forms of parallelism (vmap and pmap), among others, all composable with one another and compatible with TPUs, GPUs, and CPUs.
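To make the "composable transformations" point concrete, here is a minimal sketch (the `loss` function and array shapes are made up for illustration) showing `grad`, `jit`, and `vmap` stacking on top of one another:

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # toy scalar-valued function: squared dot product of x and w
    return jnp.sum((x @ w) ** 2)

grad_loss = jax.grad(loss)           # derivative w.r.t. the first argument, w
fast_grad = jax.jit(grad_loss)       # JIT-compile to XLA (kernels get fused)
batched = jax.vmap(fast_grad, in_axes=(None, 0))  # map over a batch of x

w = jnp.ones((3,))
xs = jnp.arange(6.0).reshape(2, 3)   # batch of two inputs
per_example_grads = batched(w, xs)   # per-example gradients, shape (2, 3)
```

The same pattern works in any order (e.g. `grad` of a `vmap`-ed function), which is what makes the transformations "composable" rather than one-off features.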