top | item 45620930

superconduct123 | 4 months ago

I always get a weird feeling when AI researchers and CS people start talking about comparisons between human brains and AI/computers.

Why is there a presumption that we (as people who have only studied CS) know enough about biology/neuroscience/evolution to make these comparisons/parallels/analogies?

I enjoy the discussions but I always get the thought in the back of my head "...remember you're listening to 2 CS majors talk about neuroscience"

matusp | 4 months ago

We should completely strip all this talk from AI as a field (and get rid of that name as well). It just causes endless confusion, especially for a general audience. In the end, the whole shtick with LLMs is that we train matrices to predict next tokens. You can explain the entire concept without invoking AGI, Roko's basilisk, the nature of human consciousness, or all the other mumbo jumbo that tries so hard to make this field what it is not.
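The "train matrices to predict next tokens" framing can be sketched with a toy bigram model (a hypothetical minimal example, not how production LLMs are built: real models stack many such matrices with attention between them, but the training objective, maximizing the probability of the observed next token, is the same):

```python
import math

# Toy sketch of "training matrices to predict next tokens":
# a bigram model whose single weight matrix W maps the current
# token id to a logit (score) for each possible next token.

vocab = ["the", "cat", "sat"]
corpus = [0, 1, 2, 0, 1, 2]  # "the cat sat the cat sat" as token ids

# W[i][j] = logit that token j follows token i
W = [[0.0] * len(vocab) for _ in vocab]

def softmax(logits):
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

lr = 0.5
for _ in range(200):  # plain gradient descent on cross-entropy loss
    for cur, nxt in zip(corpus, corpus[1:]):
        probs = softmax(W[cur])
        for j in range(len(vocab)):
            # gradient of cross-entropy w.r.t. logits: p - one_hot(target)
            W[cur][j] -= lr * (probs[j] - (1.0 if j == nxt else 0.0))

# In this corpus "the" is always followed by "cat"
probs = softmax(W[0])
print(vocab[probs.index(max(probs))])  # prints "cat"
```

That really is the whole loop: a matrix of numbers nudged until it assigns high probability to the next token seen in training data.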

scotty79 | 4 months ago

But people love misguided narratives and analogies. How else should we kill time when we are too dumb to accelerate inevitable progress and just need to wait for it?

giardini | 4 months ago

I eagerly await your publications. I will buy your book.

ainch | 4 months ago

There is a lot of overlap between AI and neuroscience, especially among older researchers. For example, Karpathy's PhD supervisor, Fei-Fei Li, researched vision in cat brains before working on computer vision; Demis Hassabis did his PhD in computational neuroscience; Geoff Hinton studied psychology; etc. There's even the Reinforcement Learning and Decision Making conference (RLDM - very cool!), which pairs reinforcement learning with neuro research and brings together people from both disciplines.

I suspect the average AI researcher knows much more about the brain than typical CS students, even if they may not have sufficient background to conduct research.

superconduct123 | 4 months ago

Fair enough, I guess it's a bit different nowadays since the background is usually a PhD in compsci.

arawde | 4 months ago

From personal experience making the same comparisons during undergrad, I think it just comes down to the availability of conceptual models. If the brain does X, there's a good chance that a computer does something that looks like X, or that X could be recreated through steps Y & Z, etc.

Once I started to realize just how much of the brain is inscrutable, because it is a machine operating on chemicals rather than strictly electrical processing, I became a lot more reluctant to draw those comparisons.

genewitch | 4 months ago

Lucky for all of us, we're alive during a "quantum" thing! Which has been an idea since at least the mid-1990s, as I first saw it in a 2600 around that time...

chasd00 | 4 months ago

> Why is there a presumption that we (as people who have only studied CS) know enough about biology/neuroscience/evolution to make these comparisons/parallels/analogies?

Well, it's straightforward. First, let's assume a spherical, perfectly frictionless brain...

tim333 | 4 months ago

AI researchers, CS people, and the rest of us are all human brain users, and so have some familiarity with brains even if we haven't studied neuroscience.

You can make some comparisons between how they perform without really understanding how LLMs or brains work. To me, LLMs seem similar to the part of human minds where you say stuff without thinking about it. But you never really get an LLM saying "I was thinking about that and figured this bit was wrong," because they don't really have that capability.

rhetocj23 | 4 months ago

I've also found this jarring; it speaks to the hubris of folks who have emerged in the past few decades and don't seem to have much connection to the humanities and liberal arts.

jjulius | 4 months ago

>Why is there a presumption that we (as people who have only studied CS) know enough about biology/neuroscience/evolution to make these comparisons?

Hubris.

rootusrootus | 4 months ago

Exactly. Someone way back when decided to call them neural networks, and now a lot of people think that they are a good representation of the real thing. If we make them fast enough, powerful enough, we'll end up with a brain!

Or not.

ctoth | 4 months ago

The hubris here isn't CS people making comparisons, it's assuming biological substrate matters. Your brain is doing computation with neurotransmitters instead of transistors. So what? The "chemicals not electricity" distinction is pure carbon chauvinism, like insisting hydraulic computers can't be compared to electronic ones because water isn't electricity. Evolution didn't discover some mystical process that imbues meat with special properties; it just hill-climbed to a solution using whatever materials were available. Brains work despite being kludges of evolutionary baggage, not because biology unlocked some deeper truth about intelligence.

Meanwhile, these systems translate languages, write code, play Go at superhuman levels, and pass medical licensing exams... all tasks you'd have sworn required "real understanding" a decade ago. At some point, look at the goddamn scoreboard. If you think there's something brains can do that these architectures fundamentally can't, name it specifically instead of gesturing vaguely at "inscrutability." The list of "things only biological brains can do" keeps shrinking, and your objection keeps sounding like "but my substrate is special!!1111"

giardini | 4 months ago

There are plenty of mathematicians, psychologists, philosophers, physicists, et al. listening in. Perhaps one day, one or more of them will drop the piece (probably math) that achieves critical mass (AGI).

There are two periods in history that "feel" like this time to me:

- prior to Einstein's theory of relativity, and
- the uncovering of quantum mechanics.

In both cases, bits and pieces of math and science were floating in the air but no one could connect them. It took teams and individuals years of arduous effort to pull it all together.

Today there are a lot more participants. The main difference seems to be that a lot of them are capitalists! 8-))

aughtdev | 4 months ago

Yeah, the last 3 years of "We now know how to build AGI" failing to deliver show that something is being missed about the nature of intelligence. The "We are all stochastic parrots" people have been awfully quiet recently.