gnulinux996|5 months ago
Effectively telling junior developers that they "don't have brains" is in very bad taste and offensively wrong.
> people would rather die than admit that there's very little practical difference between their own "thinking" and that of an AI chatbot.
Would you like to elaborate on this?
I was told that McDonald's employees would have been replaced by now, self-driving cars would be driving the streets, and new medicines would have been discovered.
"AI" has been out for a couple of years now, and still no singularity.
ACCount37|5 months ago
Setting the bar for "AI" at "singularity" is a bit like setting requirements for "fusion" at "creating a star more powerful than the Sun". Very good for dismissing all existing fusion research, but not any good for actually understanding fusion.
If we had two humans, one with IQ 80 and another with IQ 120, we wouldn't say that one of them isn't "thinking". It's just that one of them is much worse at "thinking" than the other. Which is where a lot of LLMs are currently at. They are, for all intents and purposes, thinking. Are they any good at it though? Depends on what you want from them. Sometimes they're good enough, and sometimes they aren't.
habinero|5 months ago
It's surprising you say that, considering we don't actually understand the mechanisms behind how humans think.
We do know that human brains are so good at finding patterns that they'll even see patterns that aren't actually there.
LLMs are a pile of statistics that can mimic human speech patterns if you don't tax them too hard. Anyone who thinks otherwise is just Clever Hans-ing themselves.
XenophileJKO|5 months ago
It is so weird when people pull self-driving cars out as some kind of counterexample. Just because something doesn't happen on the most optimistic timescale doesn't mean it isn't happening. These things happen slowly, and then all at once.