The progress academics continuously make in NLP takes us closer and closer to a local maximum that, when we reach it, will mark the longest and coldest AI winter ever experienced, because of how far we are from the global maximum. Progress made by academically untrained researchers will, in the end, be what melts the snow, because of how "out-of-the-current-AI-box" they are in their theories about language and intelligence in general.
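The local-vs-global optimum picture above can be made concrete with a toy sketch (the function and names here are illustrative, not taken from any real NLP system): naive hill climbing converges to whichever peak its starting point happens to sit under, with no way to notice that a higher peak exists elsewhere.

```python
import math

def f(x):
    # Two peaks: a local maximum near x = -2 (height ~1) and a
    # global maximum near x = 2 (height ~2).
    return math.exp(-(x + 2) ** 2) + 2 * math.exp(-(x - 2) ** 2)

def hill_climb(x, lr=0.05, steps=5000, eps=1e-5):
    # Naive gradient ascent using a finite-difference gradient estimate.
    for _ in range(steps):
        grad = (f(x + eps) - f(x - eps)) / (2 * eps)
        x += lr * grad
    return x

x_left = hill_climb(-3.0)   # gets stuck on the lower, local peak
x_right = hill_climb(1.0)   # finds the higher, global peak
print(round(x_left, 2), round(x_right, 2))  # ~ -2.0 and ~ 2.0
```

Purely local improvement, like the benchmark-chasing described above, only ever reports the peak of the basin it started in.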
heyitsguay|6 years ago
spinningslate|6 years ago
I'd contend that, for the most part, it doesn't matter. It's a bit like the whole ML vs AGI debate ("but ML is just curve fitting, it's not real intelligence"). The more pertinent question for human society is the impact it has - positive or negative. ML, with all its real or perceived weaknesses, is having a significant impact on the economy specifically and society generally.
It'll be little consolation for white collar workers who lose their jobs that the bot replacing them isn't "properly intelligent". Equally, few people using Siri to control their room temperature or satnav will care that the underlying "intelligence" isn't as clever as we like to think we are.
Maybe current approaches will prove to have a cliff-edge limitation like previous AI approaches did. That will be interesting from a scientific progress perspective. But even in its current state, contemporary ML has plenty of scope to bring about massive changes in society (and already is). We should be careful not to miss that in criticising current limitations.
mrec|6 years ago
I found this sentence particularly intriguing given that John Carmack recently announced that he was switching his main focus to AI.
dna_polymerase|6 years ago
This is exactly it. Every previous AI winter had one thing in common: funding was cut back. However, Google or other companies in the realm could reach a point where further investment no longer makes sense to them. Until then there won't be a winter, maybe an autumn, as smaller players disappear.
feral|6 years ago
I have direct evidence of that from my day job (building NLP chatbots for Intercom).
That business value will increase as NLP progresses, even if we're moving towards a local optimum.
Even if we do get stuck, real products and real revenue powered by NLP will help fund research on successive generations.
Of course there's tons of hype about AI. But there's also a big virtuous cycle that just wasn't present in the setup which created previous AI winters.
screye|6 years ago
Progress is always that way. It plateaus, then suddenly jumps and then plateaus again.
If your complaint is about the general move away from statistics, with deep learning becoming the norm, then there are a decent number of labs working on coming up with whatever the next deep learning is. There is probabilistic programming, and there are models with newer biologically inspired computation structures.
Even inside ML and deep learning, people are trying to come up with ways to better leverage unsupervised learning and building large common sense representations of the world.
There is certainly an oversupply of applied deep learning practitioners, but there are other approaches being explored in the AI/ML community too.
teshier-A|6 years ago
octbash|6 years ago
Making progress on NLP benchmarks? Must be a sign that we're moving even closer to an even longer and more bitter AI winter.
tanilama|6 years ago
The current AI boom is due to end, or is ending already, but this only means we are now equipped with really powerful approximators that previous generations of researchers would not even dream of, which leaves us with a really tantalizing question:
What is the right question to ask?
We have undoubtedly proved machines are superior at fitting; now we need to make them curious.
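As a toy illustration of the "superior at fitting" point (an illustrative sketch, not anyone's actual method): a closed-form least-squares fit recovers a generating line exactly, yet nothing in the procedure decides which data were worth collecting in the first place; that missing step is the "curiosity" part.

```python
# Closed-form least-squares fit of y = a*x + b: the simplest "approximator".
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]  # data generated by the "true" line y = 2x + 1

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
# Slope = covariance(x, y) / variance(x); intercept from the means.
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
b = mean_y - a * mean_x
print(a, b)  # recovers a = 2.0, b = 1.0
```

The fit is perfect given the data, but the data themselves were handed to the procedure; choosing what to observe next is exactly what current systems do not do.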
317070|6 years ago
The open question is whether AGI is the same as Schmidhuber's optimum, or even lies within Schmidhuber's basin.
[0] Cambridge-style debate on the topic at NeurIPS 2019.
tjansen|6 years ago
The_rationalist|6 years ago
visarga|6 years ago
ML is an empirical science, or a craft if you want, with useful applications. It's not the ultimate theory of intelligence.
Iv|6 years ago
2sk21|6 years ago
slumdev|6 years ago
This is an unnecessarily uncharitable view of academia.
"Outside the box thinking" is frequently just ignorance and Dunning-Kruger.
laretluval|6 years ago