DLEnthusiast's comments

DLEnthusiast | 8 years ago | on: Ask HN: Why TensorFlow instead of Theano for deep learning?

All of these frameworks are "relatively new". TensorFlow: 1.6 years. CNTK: 1 year. PyTorch: 0.5 years. Are they really impossible to compare?

> When people say PyTorch is better for research, they mean

That's not what "people" say. They tend to say the opposite. Maybe we can ask OP what he meant when he said it.

> it is easier to implement non-trivial network architectures with it, such as recursive network

It is interesting that you mention recursive networks. There are only a few dozen researchers who work with recursive networks, and they are all accounted for; we know what tools they use. They use Chainer and DyNet.
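For context on why recursive (tree-structured) networks get tied to define-by-run frameworks like Chainer and DyNet: the computation graph has to mirror each input's parse tree, so its shape changes per example. Here's a minimal sketch in plain NumPy (a hypothetical toy composition cell, not any framework's actual API) showing that the recursion itself is the graph:

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4
# Shared composition weights for a toy tanh cell (hypothetical model).
W = rng.standard_normal((DIM, 2 * DIM)) * 0.1

def compose(left, right):
    """Combine two child vectors into one parent vector."""
    return np.tanh(W @ np.concatenate([left, right]))

def encode(tree):
    """Recursively encode a tree: a leaf is a vector, an internal
    node is a (left, right) pair. Each input's parse shape dictates
    the structure of the computation, which is why static-graph
    frameworks make this awkward."""
    if isinstance(tree, np.ndarray):
        return tree
    left, right = tree
    return compose(encode(left), encode(right))

leaf = lambda: rng.standard_normal(DIM)
t1 = (leaf(), (leaf(), leaf()))            # right-branching tree
t2 = ((leaf(), leaf()), (leaf(), leaf()))  # balanced tree

v1, v2 = encode(t1), encode(t2)  # different graphs, same output dim
```

In a static-graph framework you'd have to pad, batch by tree shape, or unroll to a maximum depth; in a define-by-run one the plain recursion above is essentially the whole implementation.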

DLEnthusiast | 8 years ago | on: Ask HN: Why TensorFlow instead of Theano for deep learning?

"PyTorch is better for research" is a weird, unsubstantiated statement. The fact is that few serious researchers use PyTorch (and even those complain about it). It's mostly grad students in a handful of labs. The only researchers I know who use PyTorch are from Facebook, and that's because they were implicitly forced to use it (PyTorch is developed by Facebook).

According to https://medium.com/@karpathy/icml-accepted-papers-institutio... , 3 of the top research labs in the world are DeepMind, Google Brain (and the rest of Google), and Microsoft Research. Let's see:

* DeepMind: TensorFlow

* Google Brain: TensorFlow

* Microsoft Research: CNTK

Ok, so what about academia? The top deep learning groups in academia are:

* Montreal: Theano

* Toronto: TensorFlow

* IDSIA: TensorFlow

So, what about the greater academic research community? Maybe we could get some data about who uses what by looking at the frameworks cited by researchers in their papers. Andrej did that: it's mainly TensorFlow and Caffe. https://medium.com/@karpathy/a-peek-at-trends-in-machine-lea...
