I agree that DL/ML is destined to fail in domains like this but can you expand on this reasoning? What exactly do you mean by "quantitative math" (I haven't heard this phrase used in this way before)? And what were the equivalents of DL/ML for physics before calculus?
acdc4life|6 years ago
>And what were the equivalents of DL/ML for physics before calculus?
This is a good question. Before Newton's calculus and his law of gravitation, people were building very complicated conic models (ellipses, parabolas, etc.) to get better and better predictions of planetary motion. A lot of parametric math came out of this, with ever more sophisticated models giving these astronomers an illusion of progress. Newton's insight was that motion is connected to mass; from that insight he derived the laws of motion, which in turn gave us the law of gravitation, F = Gm1m2/r^2. This superseded the earlier Keplerian models, because you could now predict the motion of arbitrary rigid bodies using very simple math (we teach it in high school). Of course, Newtonian mechanics has its limitations, which is why we have quantum physics and Einstein's relativity. But for practical technological applications, Newtonian physics on its own gets you incredibly far.
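To make the "very simple math" point concrete, here is a minimal sketch of the law of gravitation applied to the Earth-Moon pair (the constants are standard published values, used here purely for illustration):

```python
# Newton's law of universal gravitation: F = G * m1 * m2 / r^2,
# illustrated with the Earth-Moon pair.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
m_earth = 5.972e24   # kg
m_moon = 7.348e22    # kg
r = 3.844e8          # mean Earth-Moon distance, m

def gravitational_force(m1, m2, r):
    """Force in newtons between two point masses separated by r metres."""
    return G * m1 * m2 / r**2

print(gravitational_force(m_earth, m_moon, r))  # roughly 2e20 N
```

A one-line formula with two masses and a distance; no fitted parameters, no epicycles.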
Where is ML/DL? It would be akin to Keplerian elliptical motion. More realistically, it's closer to the aether theory of light, and it will go the way of GOFAI. This stuff isn't grounded in modelling any scientific observation. Moreover, it is mathematically useless: back propagation doesn't converge, and why should you fit your data to an arbitrary mathematical structure? In practice, DL/ML doesn't work at all; you will be much more successful modelling your problem mathematically. For example, consider an automobile manufacturer with all kinds of moving parts in its products. They typically model each part mathematically (e.g. gear x undergoes exponential time decay), infer the parameters from rigorous test data, and then use some sort of empirical statistical model to predict failure.
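A minimal sketch of that kind of workflow, assuming the exponential-decay model mentioned above (all numbers are made up for illustration): measured strength is taken to follow s(t) = s0 * exp(-k*t), the parameters are inferred from test data by least squares on log(s), and failure time is predicted by solving for when strength crosses a threshold.

```python
import math

# Made-up test data, roughly following 100 * exp(-0.2 * t)
times = [0.0, 1.0, 2.0, 3.0, 4.0]            # test hours
strengths = [100.0, 81.9, 67.0, 54.9, 44.9]  # measured strength

# Linear regression of log(s) on t: log s = log s0 - k * t
n = len(times)
xs, ys = times, [math.log(s) for s in strengths]
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
k = -slope                          # inferred decay rate (~0.2)
s0 = math.exp(ybar - slope * xbar)  # inferred initial strength (~100)

threshold = 20.0                       # part fails when strength < 20
t_fail = math.log(s0 / threshold) / k  # solve s0 * exp(-k * t) = threshold
```

Two interpretable parameters from a handful of test points, and the failure prediction is a closed-form solve rather than a black box.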
I've seen deep learning companies come and fall flat on their faces trying to beat the accuracy of these deterministic systems, despite needing far more data and GPUs. I'm not even criticizing the fact that DL is a black box. It's worse than that: it's inferior to everything out there on every metric imaginable. These mathematical models, in contrast, have been in production for decades with yearly updates. They run in real time with little historical data, they are fully understandable, and they beat every method we know of.
This isn't the first time multi-layer perceptrons have been hyped. They didn't work in the 80s, the 90s, or the 2000s, and they don't work now. The math behind DL is the same as what we had in the 80s; back then they just called it the multi-layer perceptron. None of the ideas in modern ML/DL are new, including reinforcement learning, GANs, and so on.
chillacy|6 years ago
2. Likewise, Newtonian physics is also an approximation: it does not fare well at relativistic speeds or in strong gravity. But at least we have models which seem to be accurate to many decimal places today. Who knows what the future may hold.
3. Not all useful problems can be solved with simple closed-form equations, but they can still be computed numerically (e.g. the N-body problem).
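To illustrate: the N-body problem has no general closed-form solution, yet it is straightforward to integrate numerically. A minimal sketch, in assumed units where G*M = 1, of a single planet on a circular orbit around a fixed star, stepped with semi-implicit (symplectic) Euler:

```python
import math

GM = 1.0
x, y = 1.0, 0.0      # start at radius 1
vx, vy = 0.0, 1.0    # circular-orbit speed sqrt(GM / r) = 1
dt = 1e-3
steps = int(2 * math.pi / dt)  # one full orbital period (T = 2 * pi)

for _ in range(steps):
    r3 = (x * x + y * y) ** 1.5
    vx -= GM * x / r3 * dt     # update velocity first (symplectic Euler)
    vy -= GM * y / r3 * dt
    x += vx * dt               # then position, using the new velocity
    y += vy * dt

# After one period the planet ends up back near its start, (1, 0).
```

No closed-form orbit was used anywhere; the trajectory falls out of repeatedly applying F = ma with the inverse-square force.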
4. Ultimately DL is popular because it works better than anything else in some very specific domains like speech recognition and image recognition. It is overapplied, I'll admit, but if you can do better then feel free to publish a paper.