gurkendoktor|3 years ago
Even the best-case scenario that some are describing, of uploading ourselves into some kind of post-singularity supercomputer in the hopes of being conscious there, doesn't seem very far from plain extinction.
idiotsecant|3 years ago
And that's OK. We are one step toward the universe understanding itself, but we certainly aren't the final step.
37ef_ced3|3 years ago
Not long from now all creative and productive work will be done by machines.
Humans will be consumers. Why learn a skill when it can all be automated?
This will eliminate what little meaning remains in our modern lives.
Then what? I don't know, who cares?
blueblob|3 years ago
So far, we have task-driven AI/ML. It solves a problem you tell it to solve. Then you, as the engineer, need to make sure it solves the problem correctly enough for you. So it really still seems like it would be a human failing if something went wrong.
So I'm wondering why there is so much concern that AI is going to destroy humanity. Is the theoretical AI that's going to do this even going to have the actuators to do so?
Philosophically, I don't have an issue with the debate, but the "AI will destroy the world" side doesn't seem to have any tangible evidence. People take it as a given that AI could eliminate all of humanity, yet they don't support that claim in the least. From my perspective, it's fearmongering from people who watched and believed Terminator. It seems uniquely out-of-touch.
londons_explore|3 years ago
If we manage to make a 'better' replacement for ourselves, is it actually a bad thing? Our cousins on the hominid family tree are all extinct, yet we don't consider that a mistake. AI made by us could well make us extinct. Is that a bad thing?
gurkendoktor|3 years ago
I guess I wouldn't have been so angry about any of this before I had children, but now I'm very much in favor of prolonged human existence.
goatlover|3 years ago
It's bad for all the humans alive at the time. Do you want to be replaced and have your life cut short? For that matter, why should something better replace us instead of coexist? We don't think killing off all other animals would be a good thing.
> Our cousins on the hominid family tree are all extinct, yet we don't consider that a mistake.
That's just how evolution played out. But if another hominid were still alive alongside us, advocating for its extinction because we're a bit smarter would be considered genocidal and deeply wrong.
tim333|3 years ago
For me, immortality is a bigger thing than the teraflops. Also, I don't think regular humanity would be gotten rid of; it would continue in parallel.