top | item 16760841


chapill | 8 years ago

>if I help develop A.I. that can be used for all sorts of things, one of which happens to be military-related, is the effort "evil"

There's a famous quote for this:

>It should be noted that no ethically-trained software engineer would ever consent to write a DestroyBaghdad procedure. Basic professional ethics would instead require him to write a DestroyCity procedure, to which Baghdad could be given as a parameter.

https://en.wikiquote.org/wiki/Nathaniel_Borenstein

My reading is: yes, it's evil. It's handing-a-toddler-a-loaded-gun sort of evil. If you write DestroyBaghdad, you've limited the harm your program can do to what the situation specifically requires. DestroyCity is a general-purpose capability that is easily misused in the wrong hands, and ethical programmers should think carefully before writing it.
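The quote's point about scope can be sketched as a toy example (the names are the joke's, the bodies are hypothetical): the narrow procedure hard-codes its one effect, while the parameterized one is a reusable capability that accepts any target.

```python
# Toy sketch of the quote's point; the function bodies are
# placeholders that just return a string describing the effect.

def destroy_city(city):
    # General-purpose: works for any target passed in,
    # so the same code is a reusable capability.
    return f"{city} destroyed"

def destroy_baghdad():
    # Narrow: hard-codes its single target, so its possible
    # harm is bounded to exactly one outcome.
    return destroy_city("Baghdad")

print(destroy_baghdad())          # only ever "Baghdad destroyed"
print(destroy_city("Carthage"))   # any caller can pick any target
```

The narrow version can only ever do the one thing it was written for; the parameterized version hands that decision to whoever calls it, which is exactly the ethical shift the quote is satirizing.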

Doctors solve this by barring unethical members of their profession from legally practicing. Programmers should consider organizing into a formal profession with enforceable ethics, because depending on others in the field to do the right thing and police themselves hasn't been working out.


nradov | 8 years ago

It's relatively easy to prevent the unlicensed practice of medicine. But anyone can buy a computer and start programming. There's no practical way to require that all programmers adhere to a code of professional ethics.

sterlind | 8 years ago

I completely agree with the loaded-gun metaphor, but doctors are a very different kettle of fish.

Doctors are healers. The Hippocratic oath - "do no harm" - is the logical conclusion of the practice of medicine. Medicine heals, which is the opposite of causing harm. Avoiding harm is the only consistent metric of success, which explains the oath's persistence for millennia.

Can you think of a consistent, concrete set of ethics that would draw unanimous support among programmers?

chapill | 8 years ago

I think healing has less to do with it than liability. Snake oil salesmen used to be a thing.

What currently sets programmers apart is the lack of liability. Programmers write their own get out of jail free cards. We call them EULAs.

If a doctor screws up and leaves a clamp inside you after surgery, they get sued. If a programmer screws up and leaves a debugging backdoor in a shipped product, nothing happens.

>Can you think of a consistent, concrete set of ethics that would draw unanimous support among programmers?

I think if programmers can't come to a consensus on that answer, then legislators will do it for them.

If you look around, we're actually witnessing this happening right now. Populist anger has erupted after Equifax, Cambridge Analytica, and Uber. NYT opinion pieces call for changes in liability law around programming.

https://www.nytimes.com/2017/09/11/opinion/equifax-accountab...

And it's not just talk. Changes have already started. Section 230 was recently amended to carve out a small exception to the liability shield for web hosts. In response, Craigslist went nuclear in protest and dropped its Personals section. Almost nobody noticed, which means in the next round, lawmakers will be much bolder about applying liability to the businesses of programmers.

Google's "Don't be evil" was the closest thing I think we've witnessed to a Hippocratic oath for programmers. That's long gone now. Now it's all jerk tech: exploit your users for content, then demonetize them with no recourse or redress.

I don't think this Wild West can get any wilder, so the pendulum is going to swing against us from here on out. Programmers should be getting ahead of this, but like all dumb humans, we will sit stupidly. We will only react to immediately obvious consequences instead of preparing for the storm on the horizon.