Atheb | 1 year ago

You've got to give it to the PyTorch team: they're really great at bringing complex optimization schemes (mixed precision, torch.compile, etc.) down to a simple-to-use API. I'm glad I moved from TF/Keras to PyTorch around 2018-2019 and never looked back. I'm eager to try this as well.

ansk|1 year ago

I've seen and ignored a lot of "pytorch good, tensorflow bad" takes in my time, but this is so egregiously wrong I can't help but chime in. Facilitating graph-level optimizations has been one of the most central tenets of tensorflow's design philosophy since its inception. The XLA compiler was designed in close collaboration with the tensorflow team and was available in the tensorflow API as far back as 2017. It's not an exaggeration to say that pytorch is 5+ years behind on this front. Before anyone invokes the words "pythonic" or "ergonomic", I'd like to note that the tensorflow 2 API for compilation is nearly identical to torch.compile.
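For what it's worth, the two compilation entry points really are nearly identical on the surface; a minimal sketch (the decorator lines are quoted in comments rather than executed, since they assume TF 2's `tf.function(jit_compile=True)` and PyTorch 2's `torch.compile` are installed):

```python
# Side-by-side sketch of the two compilation APIs (quoted, not run, so this
# file carries no TF/torch dependency):
#
#   compiled = tf.function(train_step, jit_compile=True)   # TensorFlow 2 + XLA
#   compiled = torch.compile(train_step)                   # PyTorch 2
#
# Either wrapper captures a plain Python function into a graph, e.g.:
def train_step(w, x, y, lr=0.1):
    """One gradient-descent step on a 1-D least-squares loss (w*x - y)^2."""
    grad = 2 * x * (w * x - y)  # d/dw (w*x - y)^2
    return w - lr * grad
```

In both cases the user-facing contract is the same: hand the framework an ordinary Python function and get back a compiled callable with the same signature.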

brrrrrm|1 year ago

It's not about the API. It's about the documentation + ecosystem.

TF's documentation doesn't seem very good. I just tried to figure out how to learn a linear mapping with TF and went through this:

1. googled "linear layer in tensorflow" and got to the page about linear.

2. spent 5 minutes trying to understand why monotonicity would be a central tenet of the documentation

3. realized that's not the right "linear", and couldn't think of what the appropriate name would be

4. knew MLPs have them, so googled "tensorflow mlp example"

5. clicked the Apr '24 page: https://www.tensorflow.org/guide/core/mlp_core

6. read through 10 [!] code blocks that are basically just boilerplate setup of data and visualizations, entirely unrelated to MLPs

7. realized they call it "dense" in tensorflow world

8. saw that "dense" needs to be implemented manually

9. thought that was strange, googled "tensorflow dense layer"

10. found a keras API (https://www.tensorflow.org/api_docs/python/tf/keras/layers/D...)
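In plain terms, the "dense" layer that steps 7-8 land on is just y = xW + b; a from-scratch sketch of what that MLP guide builds manually (tf.keras.layers.Dense being the ready-made version):

```python
import random


class Dense:
    """Minimal from-scratch 'dense' (fully connected) layer: y = x @ W + b.

    A plain-Python sketch of the layer the TF MLP guide implements by hand;
    tf.keras.layers.Dense is the off-the-shelf equivalent.
    """

    def __init__(self, in_dim, out_dim, seed=0):
        rng = random.Random(seed)
        # Small random weights, zero biases -- the usual starting point.
        self.W = [[rng.gauss(0.0, 0.1) for _ in range(out_dim)]
                  for _ in range(in_dim)]
        self.b = [0.0] * out_dim

    def __call__(self, x):
        # One output per column of W: dot(x, W[:, j]) + b[j]
        return [sum(x[i] * self.W[i][j] for i in range(len(x))) + self.b[j]
                for j in range(len(self.b))]
```

With out_dim=1 and no activation, this is exactly the "linear mapping" being hunted for above.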

marcinzm|1 year ago

Tensorflow works really well in theory. In practice, a lot less so. I saw someone spend months fighting Tensorflow to convert a production model from CPU to GPU inference with any sort of efficiency. Tons of issues due to bugs across versions, deprecation of features across versions, the graph optimizer shuffling data back to the CPU for no decent reason, etc. Most of the time the person had no idea what was happening or why, due to how black-box Tensorflow was. This was a very senior ML engineer with a lot of Tensorflow experience.

lgessler|1 year ago

GP wrote "simple to use API". You can attribute many qualities to TensorFlow, but this is not one of them.

dekhn|1 year ago

Does tensorflow have a future? I doubt it. I don't think Google is really investing many resources into it (beyond the necessary maintenance to support whatever production models still depend on it). The cost of migrating from old TF to new TF was really large; half the projects that depend on TF that I try to use just break out of the box (only 1/4 of torch projects I try fail that way).

From what I can tell Google is moving in a direction that doesn't require tensorflow, and I don't see it gaining significant adoption outside Google, so it seems most likely we will simply see it deprecated in about 10 years. It's best to see it as a transitional technology that Jeff Dean created to spur ML development internally, which was mistakenly open sourced, and now Jeff's reports typically use Jax or other systems.

catgary|1 year ago

I think tensorflow-datasets and tensorflow-serving are great, but for model development I think most people use JAX and then export it to a tensorflow SavedModel with Orbax.
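A rough sketch of that JAX-to-SavedModel route, assuming orbax-export's `JaxModule`/`ExportManager`/`ServingConfig` names (the export calls are quoted in comments and not verified here); the thing being exported is just a pure function of (params, inputs):

```python
# Hedged sketch of the JAX -> TF SavedModel path via orbax-export
# (class names are from the orbax-export docs; treat as an assumption):
#
#   from orbax.export import ExportManager, JaxModule, ServingConfig
#   module = JaxModule(params, apply_fn)  # wrap params + a pure apply fn
#   ExportManager(module, [ServingConfig("serving_default")]).save("/tmp/model")
#
# The exported object reduces to a pure function of (params, inputs);
# e.g. a single ReLU unit, written in plain Python so this sketch runs:
def apply_fn(params, x):
    """Tiny stand-in model: ReLU(w * x + b)."""
    w, b = params
    pre = w * x + b
    return pre if pre > 0 else 0.0
```

The exported SavedModel can then be served by tensorflow-serving, which is how the JAX and TF ecosystems end up meeting in production.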

zozbot234|1 year ago

> Facilitating graph-level optimizations has been one of the most central tenets of tensorflow's design philosophy since its inception.

Agreed, of course, but it's not like they came up with this approach from scratch. They seem to have just picked it up from Theano (now Aesara/PyTensor).

YetAnotherNick|1 year ago

+1. As someone who has tried to migrate multiple tf.function usages to torch.compile, tensorflow's edge here is not small. torch.compile is still highly experimental. Don't believe me? Just go look at the GitHub issues where torch maintainers try to figure out why torch.compile makes code very suboptimal in a lot of cases, or why it results in incomprehensible errors.

whymauri|1 year ago

I'm so sorry but Tensorflow is simply one of the worst parts of my job.

uoaei|1 year ago

Praising XLA by defending Tensorflow of all things has to be one of the strangest takes I've ever come across.

JAX is right there. No need to beat a dead horse when there's a stallion in the stables.

yablak|1 year ago

Best way to use tensorflow is by writing models in Jax.