NalNezumi|3 months ago

There was Chainer, which pioneered the define-by-run model that made PyTorch so effective. It was developed by a much smaller, much less influential company in Japan. Early PyTorch is transparent about the debt it owes to Chainer.

cs702|3 months ago

Thanks. Yes, I remember Chainer, but only vaguely. I kinda remember looking at it, but not actually using it.

My recollection is that when I looked at Chainer back then, it didn't offer a comprehensive library of preexisting components for deep learning. When I tried PyTorch, on the other hand, I vividly remember it as already having lots of prebuilt components (common layers, activation functions, etc.) in `torch.nn`, so it was easier and faster to get going.

These memories are vague, so I could be wrong.

maxc01|3 months ago

Yes, exactly: not many people know about Chainer nowadays. Back in 2016, PyTorch's interface was actually inferior to Chainer's, and I think Chainer's design was really ahead of its time.
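For anyone who never used Chainer or early PyTorch, the define-by-run idea mentioned above can be sketched in a few lines of plain Python. This is a toy scalar autograd, not Chainer's or PyTorch's actual API; the point is that the computation graph is recorded as the forward code executes, so ordinary Python control flow (`if`, `for`) shapes the graph on every run, rather than being declared up front as in define-and-run frameworks.

```python
# Toy define-by-run autograd sketch (illustrative only, not a real framework API).
class Var:
    def __init__(self, value, parents=(), grad_fns=()):
        self.value = value
        self.parents = parents    # Vars this one was computed from
        self.grad_fns = grad_fns  # local derivative of this node w.r.t. each parent
        self.grad = 0.0

    def __add__(self, other):
        # Graph edge recorded at the moment the op runs.
        return Var(self.value + other.value, (self, other),
                   (lambda g: g, lambda g: g))

    def __mul__(self, other):
        return Var(self.value * other.value, (self, other),
                   (lambda g, o=other: g * o.value,
                    lambda g, s=self: g * s.value))

    def backward(self, g=1.0):
        # Walk the graph that the forward pass just built.
        self.grad += g
        for parent, fn in zip(self.parents, self.grad_fns):
            parent.backward(fn(g))

x = Var(3.0)
# Run-time control flow decides the graph: the hallmark of define-by-run.
y = x * x if x.value > 0 else x + x
y.backward()
print(y.value, x.grad)  # 9.0, dy/dx = 2x = 6.0
```

Chainer's insight, inherited by PyTorch, was that this "tape" built during execution makes debugging and dynamic architectures (variable-length loops, data-dependent branches) feel like normal imperative programming.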