
r1chardnl | 11 months ago

I wonder whether it'll be possible to compress enough of the game to make (almost) every possible scenario you could encounter playable. The same issue the previous AI experiments for Minecraft and others had is that objects and enemies seem to pop in and out of nowhere. Could the "learned" probability be high enough for this to never be an issue? You ever think you're seeing something in real life, but it's just an optical illusion? It kinda feels like that to me. Obviously this still requires an entire game to be made before you can train on it, but it could maybe open up other avenues for the development and testing of games.

jsheard | 11 months ago

> Obviously this still requires an entire game to be made before you can train on it, but could maybe open up other development and testing of games.

The idea of developing a game where the "code" is totally opaque and non-deterministic honestly sounds like an absolute nightmare. How would you even begin to QA something like that?

EvanAnderson | 11 months ago

> The idea of developing a game where the "code" is totally opaque and non-deterministic honestly sounds like an absolute nightmare. How would you even begin to QA something like that?

I have a fear that we're going to experience a significant regression in our ability to develop software as new "programmers" normalize the idea of "generating" "code" this way. Some kind of dystopian future where people think an "is-negative" module is a good idea, coupled with that module having been "generated" by "AI". Bone chilling.

Re: QA

Clearly we just need another generative "AI" to act as QA in an adversarial capacity to the "AI" generating the "code". Turtles all the way down.

"The Machine Stops".