gingerlime | 11 years ago:

I wonder though: even if you could 'transfer' your brain to software, there would be a 'switch' at which you get rid of your body in the process. Then the (software) copy would start being you.
Then when you die / kill the physical self, the you of right now dies. The one you are at this moment.
Would it actually make a difference to know that there's another you out there that didn't die? It's just a comforting feeling that you're not really gone and that your mind keeps "living". I'm just not sure it's much more comforting than believing in reincarnation, or that your soul goes to heaven.
detritus | 11 years ago:

Sort of like transporter technology in Star Trek, where everyone ignores the elephant in the room: all their friends died years ago, many times over.
Sure, they have a technology which creates perfect facsimiles after de-materialisation, but the Spock you knew and loved (or at least tolerated), and were speaking to on board a few moments ago, died, along with you, when you beamed down to the planet's surface.
Much like normal neuronal biological growth, we'll need the process to be as subtle and gradual as we can manage, presumably matching neuronal growth rates, so that we don't notice that we've killed ourselves along the way.
gyim | 11 years ago:

"If you always knew the wise thing to do, would you ever choose the unwise path?"

Yes, people do that. When someone cheats on their spouse, is it because it seems to be the wise path? No, it's quite likely that such a person knows it is wrong, unethical, and also very unwise, and still does it. And we all do this in smaller ways all the time: exercising our egos and being rude to each other when we know it will only make things worse; eating junk food when we know it will make us unhealthy; and so on.
People are not just rational minds: we also have instincts and emotions. Free will is not about finding the solution to a logical problem; it is about balancing between these things.
Just think about how you could define an algorithm for solving an ethical dilemma. What is your algorithm? What is Mother Teresa's algorithm? What would happen if everybody had the same (supposedly ideal) algorithm, and killed or self-sacrificed in the same kinds of situations? Would we survive at all?
If transferring our minds to software were possible, would we want to transfer our emotions or instincts at all? Would we really want to live as robots, or would we want to keep something "human" too?
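To make the rhetorical point above concrete, here's a deliberately naive sketch of ethics-as-algorithm: a utility maximiser with fixed weights. Everything in it (the weight names, the numbers, the way a dilemma is reduced to three scores per action) is invented for illustration. The point it illustrates is exactly the thread's worry: the algorithm is fully deterministic, so everyone running the same weights makes the same "moral" choice every time.

```python
# A toy, entirely hypothetical "ethical algorithm": score each possible
# action's outcome with fixed weights and pick the maximum. Real moral
# deliberation is not reducible to this; that is the point of the sketch.

def choose_action(dilemma):
    """Pick the action whose outcome scores highest under fixed weights."""
    weights = {"lives_saved": 10.0, "harm_caused": -8.0, "self_risk": -3.0}

    def score(outcome):
        # Missing factors default to 0 -- the algorithm simply cannot
        # weigh anything its designer didn't anticipate.
        return sum(weights[k] * outcome.get(k, 0) for k in weights)

    # max() is deterministic: same inputs, same "moral" choice, every time.
    return max(dilemma, key=lambda action: score(dilemma[action]))

# A trolley-style dilemma, reduced (absurdly) to three numbers per action:
trolley = {
    "pull_lever":     {"lives_saved": 5, "harm_caused": 1, "self_risk": 0},
    "do_nothing":     {"lives_saved": 0, "harm_caused": 0, "self_risk": 0},
    "self_sacrifice": {"lives_saved": 5, "harm_caused": 0, "self_risk": 1},
}
print(choose_action(trolley))  # → self_sacrifice
```

Note that with these weights, every copy of every person would self-sacrifice in this situation, which is precisely the survival problem raised above: a population of identical ideal algorithms has no behavioural diversity left.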
detritus | 11 years ago:

Strange; I consider myself somewhat up to date on technology, and yet I think we're further away from full consciousness transference to hardware than I've ever thought before.
The more I learn about how complex and how dense the brain is, the more decades I add onto that singular point at which I think I might be able to escape my meat vessel.
Sadly!
- ed: it's not that I don't think computer complexity might reach parity with the human mind in my lifetime; it's more the not-so-trivial matter of creating whatever transference mechanism is tasked with copying a human mind over at the multiply-connected-neuron level.
jqm | 11 years ago:

It may be possible to duplicate a human's thought patterns in silicon, but I'm not sure this would produce self-awareness.
Self-awareness hasn't yet been satisfactorily explained to me in terms of brain patterns. I'm not religious or superstitious, so I assume self-awareness must rely on physical phenomena, but what causes it remains, as far as I know, a mystery. Maybe a copy becomes conscious. That would be cool. But I'm skeptical it will be that easy.