jamessantiago | 7 years ago
I suppose when enough things go wrong with a complex system, it's like having runtime errors pop up that you can debug against to get a better understanding of what you created. But that first execution is just dangerous enough that you wouldn't want something this complex doing anything important. Then again, we might not recognize something as complex enough until we start running into the "unknown unknowns" of real-world usage.
Maybe a somewhat subjective qualifier for what counts as "complex" could be developed, and then the ethical question becomes: "is due diligence being taken to reduce the risks inherent in this complex system?"