
ReadEvalPost | 3 years ago

Do you love? Do you dance? Do you desire? Do you rage? Do you weep? Do you choose? Every moment of your existence you exert your will on the world.

A denial of will is a denial of humanity. I want nothing of a science that would do such a thing.


Surgeus | 3 years ago

This points to something related that I think about a lot: can you prove to me that you do any of those things? Can I prove to you that I do any of those things? That either of us has a will? When would you be willing to believe a machine could have these things?

Computing the Mind by Shimon Edelman presents a concept that I've come to agree with: at some point you need to take a leap of faith in matters such as consciousness, and I would say it extends to will as well (to me, what you've described are facets of human consciousness). We take this leap of faith every time we interact with another human; we don't need them to prove they're conscious or beings with a will of their own, we just accept without a thought that they possess these things. If machines gain some form of sentience comparable to that of a human, we'll likely have to take that leap of faith for them too.

That said, to claim that will is necessary for intelligence is a very human-centered point of view. Unless the goal is specifically to emulate human intelligence/consciousness (which is a goal for some but not all), "true" machine intelligence may not look anything like ours, and I don't think that would necessarily be a bad thing.

dekhn | 3 years ago

Not just consciousness: all of science requires a leap of faith, namely the idea that human brains can comprehend general universal causality. There is no scientific refutation of Descartes' Great Deceiver; it's taken as a given that humans could eventually overcome any evil demon (https://en.wikipedia.org/wiki/Evil_demon) through the use of their senses and rationality alone.

I have long worked on the assumption that we can create intelligences whose subjective agency no human could deny, while never being able to verify it. I did some preliminary experiments on idle cycles on Google's internal TPU networks (i.e., large-scale brain sims using tensorflow and message passing on ~tens of pods simultaneously) that generated interesting results, but I can't discuss them until my NDA expires in another 9 years.

dekhn | 3 years ago

Appeals to humanity do not convince me of anything. I do all those things (well, I dance terribly), but again, those are not indications of will, and it's entirely unclear what magical bit in our bodies does these things that computers cannot.

Even if you want nothing to do with such a science, such a science will move on without you.

"A version of an oft-told ancient Greek story concerns a contest between two renowned painters. Zeuxis (born around 464 BC) produced a still life painting so convincing that birds flew down to peck at the painted grapes. A rival, Parrhasius, asked Zeuxis to judge one of his paintings that was behind a pair of tattered curtains in his study. Parrhasius asked Zeuxis to pull back the curtains, but when Zeuxis tried, he could not, as the curtains were included in Parrhasius's painting—making Parrhasius the winner."

tsimionescu | 3 years ago

Why would an AGI be unable to do these things? Sure, if you believe in a transcendental soul (mind/body dualism), then you can argue that it can't because Divinity has simply not endowed it with one, and that claim can neither be proven nor disproven. But it's an extra assumption that gets you nothing.

Note that I personally believe we are more than a century away from AGI, and I think the current models are fundamentally limited in several ways. But I can't imagine what makes you think there can't be a Ghost in the Machine.