top | item 24534033

bodono | 5 years ago

I'm not sure this is really going to take off; it seems that most people who are abandoning TF are moving to JAX or PyTorch. My own experience with JAX is that it is much easier to use than TF, just an all-round more pleasant experience. It would be interesting to try this, but at this point I'm not really willing to learn 'yet another deep learning framework', and the extreme anti-user problems that TF had make me loath to give it another shot, even with a presumably better frontend. Moreover, I think that Python is just a better all-round ML/data science language at this point. Has anyone tried both JAX and this and would be willing to give us their thoughts on the strengths and weaknesses of each?

gas9S9zw3P9c | 5 years ago

I'm skeptical of JAX. It feels good right now, but when the first TF beta version came out it was very much like that too - clean, simple, minimal, and just a better version of Theano. Then the "crossing the chasm" effort started and everyone at Google wanted to be part of it, making TF the big complex mess it is today. It's a great example of Conway's Law. I'm not convinced the same won't happen to JAX as it catches on.

PyTorch has already stood the test of time and proven that its development is led by a competent team.

bodono | 5 years ago

I know where you're coming from, but TF in my opinion was very user-hostile even on arrival. I can't tell you how much hair-pulling I did over tf.cond, tf.while_loop, and the whole gather/scatter paradigm for simple indexing into arrays. I really think the people working on it wanted users to write TF code in one particular way and made it really difficult to use it any other way. Just thinking back on that time still raises my blood pressure! So far JAX is much better, and I'm cautiously optimistic that they have learned the lessons from TF.
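
For contrast, here's a minimal sketch (assuming a recent JAX install) of how the gather/scatter-style indexing the comment complains about looks in JAX, where arrays are immutable and updates are written functionally:

```python
# Minimal sketch: functional indexed updates in JAX (arrays are immutable).
import jax.numpy as jnp

x = jnp.zeros(5)

# Scatter: x[2] = 7.0 is not allowed; .at[...].set(...) returns an updated copy.
y = x.at[2].set(7.0)
print(y)        # [0. 0. 7. 0. 0.]

# Gather: just ordinary fancy indexing.
idx = jnp.array([0, 2, 4])
print(y[idx])   # [0. 7. 0.]
```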

iflp | 5 years ago

> I'm not convinced the same won't happen to JAX

And now there are already multiple NN libraries for JAX from Google...

MiroF | 5 years ago

All I want is a way to statically type check tensor axes. Why can't I get a way to statically type check tensors?
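
As an illustrative sketch only (nothing here is a real library, all names are hypothetical, and this is a runtime check rather than the true static checking the comment asks for), one can at least attach axis names to annotations via `typing.Annotated` and verify them in a decorator:

```python
# Illustrative sketch: a tiny runtime axis checker built on typing.Annotated.
# True *static* checking of tensor axes would need dedicated tooling support.
from typing import Annotated, get_args, get_origin, get_type_hints
import numpy as np

def check_axes(fn):
    """Verify that each annotated argument's ndim matches its axis names."""
    hints = get_type_hints(fn, include_extras=True)

    def wrapper(*args, **kwargs):
        for name, value in zip(fn.__code__.co_varnames, args):
            hint = hints.get(name)
            if hint is not None and get_origin(hint) is Annotated:
                axes = get_args(hint)[1]
                assert value.ndim == len(axes), (
                    f"{name}: expected axes {axes}, got shape {value.shape}")
        return fn(*args, **kwargs)
    return wrapper

Matrix = Annotated[np.ndarray, ("batch", "dim")]

@check_axes
def mean_over_batch(x: Matrix) -> np.ndarray:
    return x.mean(axis=0)

print(mean_over_batch(np.ones((4, 3))).shape)  # (3,) -- axes check passes
```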

alpineidyll3 | 5 years ago

The subtext is Google would love even more Google projects to be ml prerequisites.

sandGorgon | 5 years ago

I have just started hearing about JAX, but it seems to be a low-level library that TensorFlow uses, right?

The latest release of TensorFlow Probability uses JAX under the hood. So what do you mean when you say you're moving to JAX versus TensorFlow?

joaogui1 | 5 years ago

In your first sentence you're confusing JAX with XLA.

XLA (Accelerated Linear Algebra): I guess it's kind of a backend/compiler that optimizes linear algebra/deep learning computations with some very interesting techniques, among them kernel fusion.

JAX: in some sense syntactic sugar over XLA, but a better way of describing it is composable transformations + NumPy + some SciPy. The composable transformations let you take derivatives (of single-, multi-, or vector-valued functions, including higher-order derivatives), JIT-compile a function (which is then lowered to XLA), and apply two forms of parallelism (vmap and pmap), among others, all while composing with one another and running on TPUs, GPUs, and CPUs.
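
A minimal sketch of these transformations composing (assuming jax is installed; `f` here is just an example function):

```python
# Sketch of JAX's composable transformations: grad, jit, and vmap.
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sum(x ** 2)  # scalar-valued function of a vector

x = jnp.arange(3.0)                      # [0., 1., 2.]

grad_f = jax.grad(f)                     # derivative transformation: 2x
fast_f = jax.jit(f)                      # JIT-compiled via XLA
per_elem = jax.vmap(jax.grad(lambda t: t ** 2))  # vectorized scalar derivative

print(grad_f(x))                         # [0. 2. 4.]
print(float(fast_f(x)))                  # 5.0
print(per_elem(x))                       # [0. 2. 4.]

# The transformations compose freely, e.g. a compiled gradient:
print(jax.jit(jax.grad(f))(x))           # [0. 2. 4.]
```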

sakex | 5 years ago

I don't think that the main goal of Swift for TensorFlow is to train models using Swift. I think it's mainly to deploy them in production on iPhones.