The Bobiverse books by Dennis E. Taylor touch on this: Bob is a human who becomes the AI in a replicant, the intelligence behind a von Neumann probe sent out from Earth. One of the ways he maintains his sanity is by building a simulated VR environment for himself.
I personally think that the really hard part in creating an AGI is going to be training. As an adult human, one can look at an object and instantly know what its texture will feel like on one's fingers or even lips, how it will bounce, and whether it will shatter. How did we gain that knowledge? As babies we put absolutely everything we saw into our mouths and sucked or chewed on it. As kids we played for hours on end with all kinds of toys and household objects. Coming up with a way of training an AGI to human level will be hard, because "common sense" covers such a vast scope of knowledge that it will be exceedingly difficult to implant into an AGI without the AGI being able to interact with the real world.
On the other hand, once that body of knowledge becomes an available training data set, evolution can take off at speeds otherwise impossible in the real world.
It's not 'like' anything; computer programs don't have true feelings, though they can certainly be programmed to act as if they do. They can be programmed to say 'hey, life is great', but it means nothing. They have no soul, no spark from the divine.
Those early Sierra games, the ones that used typed text to interact with the environment, were really inspirational. I was very disappointed when they replaced that CLI-like interface with mouse-driven icons. No doubt the icons were more user-friendly and avoided the frustration of guessing the right wording, but it still felt like something was lost.
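Those typed interfaces were usually built on simple verb-noun matching. A rough sketch of the idea (all verbs, objects, and responses here are invented for illustration, not taken from Sierra's actual engine):

```python
# Minimal verb-noun parser in the style of old typed-text adventures.
# The vocabulary and responses below are made up for this example.
RESPONSES = {
    ("look", "tree"): "A gnarled oak. Something glints in a hollow.",
    ("take", "key"): "You pocket the small brass key.",
    ("open", "door"): "The door is locked.",
}

ARTICLES = {"the", "a", "an"}

def parse(command: str) -> str:
    """Split a typed command into verb + noun and look up a canned response."""
    words = [w for w in command.lower().split() if w not in ARTICLES]
    if len(words) < 2:
        return "I don't understand that."
    verb, noun = words[0], words[1]
    return RESPONSES.get((verb, noun), f"You can't {verb} the {noun}.")

print(parse("open the door"))  # The door is locked.
print(parse("eat tree"))       # You can't eat the tree.
```

The charm (and the flaw) of these parsers is visible even here: anything outside the hand-built vocabulary falls through to a stock refusal, which is exactly the "trying to enter the right text" problem the icons later solved.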
Relatedly, I'm working on natural language understanding, which I believe is key to AGI. https://lxagi.com
It would depend on the legality of the situation -- like in Gibson's Neuromancer, where a fully intelligent AI has Swiss citizenship and yet the software and the supercomputer it runs on are owned by a corporation -- "Like, I own your brain and what you know, but your thoughts have Swiss citizenship. Sure. Lotsa luck, AI." as a character puts it.
mint2|3 years ago
A headline is not the place for acronyms unless it's very certain the target audience is already familiar with them. Guess I'm not that audience.
porknubbins|3 years ago
Especially when the headline poses a counterfactual/impossible question, the writer should help the reader as much as possible.