top | item 24741547


dtrailin | 5 years ago

For me the thorny ethical issue is not so much working on directly evil stuff like missile targeting software for the military, but software that has both military and civilian use. For example, consider a company that develops an RTOS used in both cars and missiles — where does that stand morally? Even the most benign technology can be used for evil, so it's worth asking what culpability individuals bear for who their company sells their software to. In the case of SpaceX, I'm sure much of the code is the same between the military and the NASA use cases.


geofft | 5 years ago

I've asked myself this question about even more abstracted/lower-level things, like - is it good to improve Kubernetes, knowing that this makes things better for big businesses and the military (cf. https://thenewstack.io/how-the-u-s-air-force-deployed-kubern...) and not really directly for individual people?

While it's tempting to say that there's no moral dimension this far out, I think that's not quite true — there is a moral dimension to making technology work more smoothly. It increases the ability of anyone with some power to do things at scale. On the one hand, it lets people with tiny amounts of power make use of it (consider the effect of, say, typewriters in the Soviet Union, or even farther back, the printing press in the hands of Luther's supporters). On the other hand, it absolutely increases the ability of people with massive amounts of power to use that power more efficiently (like the "war cloud").

My own view here is that, for those of us whose day job is improving infrastructure / technical leverage, this raises the importance of our making sure, as members of society, that power is distributed equitably and justly. Since the connection is indirect, we don't have to advocate for this through what we choose to be directly employed on (the way we would if we were literally working on missile targeting software), and it's often difficult to do so — but we should advocate for it through how we engage with the political process (broadly defined) in the rest of our lives.

There's also the question of what sort of improvements you make. Security improvements are generally something you can feel good about: people with small amounts of power are much more likely to have security flaws exploited against them than to use security flaws offensively, so even if there's an argument that hardening some software makes it harder to attack evil businesses or evil governments, it has the much more practical effect of making it harder for those entities to attack dissidents' personal devices. And similarly for what you choose to work on: working on Signal helps the low-power individual much more than working on SELinux does, even though both are conceivably dual-use.