sonink|1 year ago
I believe it is almost certain that we will make something like this and that they will out-compete us. The bigger problem is that too few people believe this to be a possibility. And by the time this certainty becomes apparent to a larger set of people, it might be too late to tone things down.
AI isn't like the Atom Bomb (AB). The AB didn't have agency. Once the AB was built, we still had time to think about how to deploy it, or whether to deploy it at all. We had time to work toward a global consensus to limit its use. But once AI manifests as AGI, it might be too late to shut it down.
mylastattempt|1 year ago
In my opinion, this is easily noticeable when you try to discuss any system, be it political or economic, that spans multiple countries and interests. People will just revert to whatever is closest to them, rather than being able to foresee the larger cascading result of some random event.
Perhaps this is more of a rant than a comment, apologies. I suppose it would be interesting to have an online space to discuss where things are headed on a purely logical level, without emotion, ideals, or the ridiculous idea that humanity must persevere. Just thinking through what could happen in the next 5, 10, and 99 years.
sonink|1 year ago
Absolutely. Happy to be part of it if you are able to set it up.
hollerith|1 year ago
Could you expand on what you mean by this? Specifically, is it OK with you if progress in AI causes the death of all the original-type human people like you and me?
tivert|1 year ago
I think the bigger problem is that too many people are focused on short-term things like personal wealth or glory.
The guy who makes the breakthrough that enables the AGI that destroys humanity will probably win the Nobel Prize. That potential Nobel probably looms larger in his mind than any doubt that his achievement is actually a bad thing.
The guy who employs that guy, or who productionizes his idea, will become a mega-billionaire. That potential wealth and power probably looms larger in his mind than any doubts, too.
visarga|1 year ago
It's in human hands, and we can hardly trust the enemy or even ourselves. We have already come close to extinction a couple of times.
I presume that when ASI emerges, one of its top priorities will be to stop the crazies with big weapons from killing us all.
mensetmanusman|1 year ago
It would require a civilization to consciously bond with its capability to do so (in such a way that the capability enhances the survival of the humans serving it). I'm not sure this would be competition in the normal sense.
rerdavies|1 year ago