
gnulinux996 | 5 months ago

> That would have had more weight if you haven't just described junior developer behavior beforehand.

Effectively saying that junior developers "don't have brains" is in very bad taste and offensively wrong.

> people would rather die than admit that there's very little practical difference between their own "thinking" and that of an AI chatbot.

Would you like to elaborate on this?

I was told that McDonald's employees would have been replaced by now, that self-driving cars would be driving the streets, and that new medicines would have been discovered.

It's been a couple of years since "AI" came out, and there's no singularity yet.

ACCount37 | 5 months ago

LLMs use the same type of "abstract thinking" process as humans. Which is why they can struggle with 6-digit multiplication (unlike computer code, very much like humans), but not with parsing out metaphors or describing what love is (unlike computer code, very much like humans). The capability profile of an LLM is amusingly humanlike.

Setting the bar for "AI" at "singularity" is a bit like setting requirements for "fusion" at "creating a star more powerful than the Sun". Very good for dismissing all existing fusion research, but not any good for actually understanding fusion.

If we had two humans, one with IQ 80 and another with IQ 120, we wouldn't say that one of them isn't "thinking". It's just that one of them is much worse at "thinking" than the other. Which is where a lot of LLMs are currently at. They are, for all intents and purposes, thinking. Are they any good at it though? Depends on what you want from them. Sometimes they're good enough, and sometimes they aren't.

habinero | 5 months ago

> LLMs use the same type of "abstract thinking" process as humans

It's surprising you say that, considering we don't actually understand the mechanisms behind how humans think.

We do know that human brains are so good at pattern-matching that they'll even see patterns that aren't actually there.

LLMs are a pile of statistics that can mimic human speech patterns if you don't tax them too hard. Anyone who thinks otherwise is just Clever Hans-ing themselves.

jimbo808 | 5 months ago

This is wrong on so many levels. I feel like this is what I would have said if I never took a neuroscience class, or actually used an LLM for any real work beyond just poking around ChatGPT from time to time between TED talks.

XenophileJKO | 5 months ago

Living in Silicon Valley, I see MANY self-driving cars on the road right now. At a stoplight the other day, I was between three of them, none with a human inside.

It is so weird when people pull self-driving cars out as some kind of counterexample. Just because something doesn't happen on the most optimistic timescale doesn't mean it isn't happening. These things happen slowly, and then all at once.

jimbo808 | 5 months ago

Fifteen years ago they said truck drivers would be obsolete within one to two years. They're still not obsolete, and they aren't on track to be any time soon, either.