item 34549724

Classical ML Still Relevant?

17 points | Sanej | 3 years ago

** Please bear with me if this has already been discussed or isn't relevant **

With the proliferation of DL and LLMs, along with near-unlimited compute, energy, and bandwidth, do we still need classical ML approaches to solve problems? Is DL / NN going to take over everything?

19 comments


apohn|3 years ago

One of the ways I think about this type of problem is by asking "You want to use computation to extract a signal from this data. What's that signal worth to you in business ROI dollars?"

If Domain Expertise + Feature Engineering + ML model can get you 90% of the way there and it runs on a tiny cloud instance that takes 30 minutes to train, is a DL-based approach that pushes you to 91% worth it from an ROI standpoint if it takes a 4x GPU cluster 2 days to train, not to mention inference costs? Especially if you need to explain what the model is doing?

The above is exactly the situation I'm in now at my job. I'm on the "Get useful stuff to production so we can save money" side of things, and we have R&D teams who try to approach the same problems using DL and all the latest methods. At least for the use cases our team focuses on, they haven't been able to do more than set $$$ on fire via GPUs. For us, Domain Knowledge + Good Data Engineering is the secret.

I think ML is going to be around for a long time because it works, even though DL is dominating the news right now. Just because a neurologist can also diagnose and treat common medical conditions (e.g., pneumonia), that doesn't mean we need every doctor to be a neurologist.

Sanej|3 years ago

this is awesome! thanks

Salgat|3 years ago

Classical ML is still the dominant form of ML, and is preferred for most forms of tabular data (think spreadsheets). It's just far faster and often more effective than deep learning. Deep learning's greatest strength is that it can do the feature generation for you, which is great for more abstract data inputs such as pixel arrays and word sequences. Deep learning receives a lot more attention because it's doing things that normally would require a human to do.
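To make the tabular-data point concrete, here's a minimal sketch (assuming scikit-learn; the synthetic dataset and model choice are illustrative, not from the comment) of a classical model trained in seconds on a laptop CPU:

```python
# Sketch: a classical model on synthetic tabular data.
# Assumes scikit-learn is available; dataset and hyperparameters are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# 2000 rows x 20 columns -- a stand-in for a modest spreadsheet-like dataset.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(random_state=0)
model.fit(X_tr, y_tr)  # trains in seconds on a single CPU core
print(round(model.score(X_te, y_te), 2))
```

No GPUs, no feature learning: on tabular data like this, tree ensembles are usually a strong first choice.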

softwaredoug|3 years ago

The "model" is the boring part of ML.

ML isn't about deep learning versus not deep learning. To me it's fundamentally about a statistical formulation of a business problem.

It's about how you evaluate ML, formulate business tasks into an objective function, understand and develop training data, and ensure the features actually measure what's important in the domain.

Sanej|3 years ago

awesome!

PaulHoule|3 years ago

I don't see a discontinuity.

There are problems where classical ML works fine and if it works, why change it?

In text classification it depends on the problem but often the old methods work very well and there is not a lot of room for neural methods to do better.
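An "old method" here usually means something like TF-IDF features plus a linear model. A minimal sketch (assuming scikit-learn; the toy corpus is made up for illustration):

```python
# Sketch: classic text classification with TF-IDF + a linear model.
# Assumes scikit-learn; the tiny corpus below is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = ["great movie, loved it", "terrible movie, hated it",
        "loved the acting", "hated the plot",
        "great fun", "terrible bore"]
labels = [1, 0, 1, 0, 1, 0]  # 1 = positive, 0 = negative

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(docs, labels)
print(clf.predict(["loved it, great"])[0])
```

On many real text-classification datasets a pipeline like this is a surprisingly hard baseline to beat.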

For images or audio however I think a deep network would almost always be in the picture.

Often people use a pretrained neural network to make an embedding and then use classical ML methods to make a classifier that works on that embedding.
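That embedding-then-classical-model pattern can be sketched like this (assuming scikit-learn; in practice the extractor would be a pretrained deep network, and here PCA merely stands in as a cheap "frozen embedding" step):

```python
# Sketch of the pattern: frozen feature extractor + classical classifier.
# In a real pipeline the extractor is a pretrained deep network (e.g. an
# image or sentence encoder); here PCA over the sklearn digits images is
# a stand-in for that embedding step.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)  # 8x8 digit images, flattened to 64-d
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# "Embed" each image into a compact fixed-length vector, then fit a
# simple classical head on top of those vectors.
clf = make_pipeline(PCA(n_components=16), LogisticRegression(max_iter=1000))
clf.fit(X_tr, y_tr)
print(round(clf.score(X_te, y_te), 2))
```

Only the small head is trained per task; the embedding step stays fixed, which keeps training cheap.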

The data prep and evaluation process is very much the same no matter what kind of model you are using.

fdgsdfogijq|3 years ago

"text classification it depends on the problem but often the old methods work very well and there is not a lot of room for neural methods to do better."

This couldn't be further from the truth. NLP/text algorithms have seen model improvements from NNs more than any other field.

alpineidyll3|3 years ago

Basically in any case of small data: anything with fewer than 8,000 points or so. It's a struggle to avoid introducing bias into a deep model with such tiny data.

That said, it's pretty saturated as a field of study. People work on uncertainty quantification etc. But it's unclear what numbers people would want to improve.

jononor|3 years ago

The combination can be very useful sometimes, for example transfer learning for working with low-resource datasets/problems. Use a deep neural network to go from high-dimensionality data to a compact fixed-length vector, basically doing feature extraction. This network is increasingly trained on large amounts of unlabeled data, using self-supervision. Then use a simple classical model like a linear model, Random Forest, or k-nearest-neighbours to build a model for the specialized task of interest, using a much smaller labeled dataset. This is relevant for many tasks around sound, images, and multi-variate timeseries. Probably also NLP (not my field).
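The low-resource end of this can be sketched as follows (assuming scikit-learn and NumPy; the embedding vectors are random stand-ins for the output of a self-supervised network, and all sizes are illustrative):

```python
# Sketch: tiny labeled set + compact embedding vectors + k-NN head.
# The "embeddings" below are random stand-ins for vectors produced by a
# self-supervised network; dimensions and counts are illustrative.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)

# Pretend two classes whose embeddings cluster around different centers,
# as a good self-supervised encoder tends to produce.
centers = rng.normal(size=(2, 128))
Z = np.vstack([centers[i] + 0.3 * rng.normal(size=(30, 128)) for i in (0, 1)])
y = np.repeat([0, 1], 30)  # only 60 labeled examples in total

knn = KNeighborsClassifier(n_neighbors=5).fit(Z, y)
print(knn.score(Z, y))
```

With well-separated embeddings, even 60 labels and a k-NN head can be enough; no gradient training happens for the specialized task at all.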

Sanej|3 years ago

cool!

PartiallyTyped|3 years ago

> Is DL / NN going to take over everything?

It will only take over the cases where you have vast swaths of data, don't have reasonable preprocessing approaches that simplify the task, and don't need statistical guarantees.

psyklic|3 years ago

"Classical ML" is not very different from NNs. For example, basic NNs are essentially logistic regressors chained together. And NNs are evaluated and trained very similarly to simpler models (gradient descent, log loss, etc.).
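The "logistic regressors chained together" point can be shown directly: a single "neuron" trained with gradient descent on log loss *is* logistic regression. A minimal NumPy sketch (toy data and learning rate are assumptions):

```python
# Sketch: logistic regression written as a single sigmoid "neuron"
# trained with plain gradient descent on log loss -- the same recipe
# a basic NN layer uses. Toy data and hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)  # linearly separable toy labels

w, b = np.zeros(2), 0.0
for _ in range(500):                           # full-batch gradient descent
    p = 1 / (1 + np.exp(-(X @ w + b)))         # sigmoid activation
    grad_w = X.T @ (p - y) / len(y)            # gradient of mean log loss
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

acc = np.mean((p > 0.5) == y)
print(acc)
```

Stack several such layers with nonlinearities in between and you have a basic feed-forward network; the training loop barely changes.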

NNs also often perform similarly to or worse than simpler models when you have "medium-sized" (and/or tabular) data. In fact, I nearly always start with simpler models when consulting -- why immediately make it complicated if a smaller, more interpretable model works well?

michaericalribo|3 years ago

It still has a place, but with precise targeting at specific problems. Generalized solutions that prioritize accuracy over all else may be successful at a wider variety of tasks, but there will always be a role for statistical analysis and modeling that can achieve sufficient accuracy to be useful.

stevofolife|3 years ago

Tabular data. Requirements for explainability and interpretability.