Does no one build their own ML algos anymore? I don't understand the need for PyTorch and TensorFlow. I honestly thought TensorFlow was nothing but a teaching tool for undergrads.
This type of reasoning can be extended to any high-level tool: "Does no one write their own OS? I don't understand the need for Linux or Windows. I honestly thought Windows and Linux were nothing but tools for undergrads to use Excel or host a WordPress site." And this is not a caricature of your argument. There is a lot of stuff under the hood that TensorFlow or PyTorch implements for a programmer, so much so that people have written wrappers on top of TF and PyTorch to abstract the workings of the library even further. Implementing a deep-learning architecture is less of a science and more of a "let me try this or that", and iterating on ideas quickly is of the utmost importance. Also, I can implement a neural network in C (or CUDA), although not the auto-diff part (though I could, given time to research it), but if I started implementing my own library, it would take an order of magnitude (or more) more time to do the stuff I do daily. We don't need to reinvent the wheel here, guys.
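To make "the auto-diff part" concrete: without a framework, even a toy model's gradients have to be derived and coded by hand. A minimal sketch in Python (hypothetical toy data, not anyone's production code):

```python
import math

# Toy logistic-regression "network": y = sigmoid(w*x + b).
# The gradient expressions below are derived by hand, which is
# exactly the bookkeeping that autodiff in TF/PyTorch automates.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# tiny made-up separable dataset
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]

w, b, lr = 0.0, 0.0, 0.5
for _ in range(200):
    gw = gb = 0.0
    for x, t in data:
        y = sigmoid(w * x + b)
        # d(cross-entropy)/d(pre-activation) = y - t, derived by hand
        d = y - t
        gw += d * x
        gb += d
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

print(all((sigmoid(w * x + b) > 0.5) == bool(t) for x, t in data))  # True
```

For one scalar weight this is easy; the hand-derivation is what stops scaling once the architecture gets complicated.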
That's what I'm getting at: the stuff under the hood is what's important, devil in the details and all that. I'm also a quant, so every ML algo needs to be tailored, so idk.
Do you also write your own automatic differentiation tools? Using libraries like TF and PyTorch makes sense if you use neural networks because they provide automatic differentiation (who wants to write out their gradients by hand?) and standard neural network components.
Edit: If your algorithm is not using neural networks, then libraries like TF may or may not be a good fit, it depends on the algorithm.
Writing custom low-level code can still make sense in those cases.
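For instance, when the algorithm has a closed-form solution, a from-scratch version really is tiny and needs no framework at all. A sketch of simple linear regression in plain Python (made-up toy data, not any particular quant model):

```python
# Simple linear regression via the closed-form least-squares
# solution; no tensor framework required for a model this small.

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]  # exactly y = 2x + 1

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
# slope = cov(x, y) / var(x), intercept from the means
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx
print(slope, intercept)  # 2.0 1.0
```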
Although the endpoint is likely to be a better understanding of the choices made by a mature implementation, and of the work involved in fixing up edge cases.
Not all of us need to build our own ML algos, just as not all of us need to build our own sorting libraries or data structures. Some people specialize in this for development and research, while other software engineers just want something they can use without much hassle and with only a superficial understanding.
They're frameworks that implement high-performance tools commonly used in ML problems, like tensor operations, automatic differentiation, various gradient-descent optimisers, and neural network building blocks.
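To see what the autodiff ingredient boils down to, here is a minimal scalar reverse-mode sketch (my own toy code; real frameworks do the same thing over tensors, with hardware-optimised kernels and a far larger operator set):

```python
# Minimal scalar reverse-mode autodiff: each Var remembers its
# parents and the local derivative along each edge, and backward()
# accumulates gradients by the chain rule along every path.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent_var, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # accumulate d(output)/d(self), then push upstream
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = Var(2.0)
z = x * y + x  # z = xy + x, so dz/dx = y + 1 = 3, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 3.0 3.0
```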
I am sure you could write stuff like Differentiable Processors or the like from scratch with numpy, but if you respect yourself and your time, you won't. Complicated architectures are orders of magnitude harder than writing feed-forward networks from scratch. For example, see the Merlin paper.
http://blog.rogerluo.me/2018/10/23/write-an-ad-in-one-day/
http://blog.rogerluo.me/2019/07/27/yassad/
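The linked posts make the point that a usable AD is roughly a one-day project. For a flavour of the idea, here is a toy forward-mode AD using dual numbers in Python (my own sketch, not the posts' code):

```python
# Forward-mode autodiff with dual numbers: carry the derivative
# alongside the value through every arithmetic operation.

class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        # product rule applied alongside the value computation
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)

def derivative(f, x):
    # seed the derivative slot with 1.0 and read it back out
    return f(Dual(x, 1.0)).deriv

# f(x) = x*x*x, so f'(x) = 3x^2 and f'(2) = 12
print(derivative(lambda x: x * x * x, 2.0))  # 12.0
```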
And yet they love to ask you to do exactly that at technical interviews... Coming up next: what ML algos you need to know to ace that interview.
What I have to say is this: please don't build your own.
Some people do. It's a good challenge.[0]
[0] https://cryptopals.com/