onceKnowable|7 years ago
Take an example: your team has developed technology that can identify, with five nines accuracy, the yearly income of a user. That’s innovative, for sure. And nobody is taking the achievement away from your team.
But you don’t need to be Socrates to realize that what you’ve created could be used in horrifying ways. The article is asking creators to take a step back, ignore the excitement of creation for one moment, and consider whether what they’re doing should be done. And if the answer is yes, then decide whether a public debate is required to protect the public from that new technology’s misuse.
Now the first reaction from many will be to scoff “fuck that, I’m just building an app here, I’ve got an investment to recoup and money to make”. That’s valid, but it’s an unethical way of looking at the act of creation. Feel free to hide behind the “I’m just doing my job” excuse, but we all know where that ended up.
That is the crux here: politicians can only make laws to protect the public at large from misuse of your newly created technology if they know about it. And the public at large can only demand robust protection (in the form of laws made by their politicians) from misuse of your technology if it knows about it. Without a public debate, a new technology could be so misunderstood by politicians that, even if they are aware of it and even if they legislate, the laws they make may not robustly protect the public.
Thus, from an ethical standpoint, a public debate is needed so that robust laws are demanded and provided in order to protect the public at large from new technologies.
overeater|7 years ago
I think if we start to delve into the actual implementation of this, and look at real examples, it will be clear that this idea of "research that can be used for bad should be not published" leads to a bad type of society.
onceKnowable|7 years ago
The physicists & engineers didn’t identify the MAD inevitability from developing nukes. The public did.
I’m not demonizing the physicists or engineers here either; how were they to know what the outcome of their developments would be?! But the public at large contains people with ways of looking at new technologies other than “this is innovative!”, “this will end the war!”, or “this will be profitable!”. It’s these outsider viewpoints that are needed for the public to decide whether laws are required to robustly protect it from that new technology.
For specific cases like those, these debates are pointless because we already have the nukes and chemical weapons examples from recent history; the ethics are clear. I’m speaking more about developers of new technologies alerting the public at large to the potential risks of their technology, so that robust laws protecting the public from its misuse can be demanded where prudent.
But, in saying that, there are examples of programmers who could have alerted the public in the public interest but kept their mouths shut instead, such as the developers who wrote the emissions-cheat software for Volkswagen. Worse, they knew their actions would lead to deaths in the populations where those cars were sold, because the link between emissions and deaths has been well understood for decades. That is a crime perpetrated at the front lines of software development. Those developers knew what they were doing, and their “secrecy” is nothing more than a bloody conspiracy to murder. They didn’t give a fuck how many died from the emissions their cars released during the decade they were on sale, because “I’m just doing my job building the software I was asked to build” was an adequate excuse in their book.