I once worked for a large industrial group in Europe. The kind that has a piece of every little pie there is - military, transportation, etc. I was pretty happy working there... until I got a demo from the 'defence' group.
They demonstrated the willingness to push the company's technology into heinous, heinous territory. The kind of thing where a drone would be able to follow a single person in a crowd, and target them for execution - unguided, of course.
I quit the next day. Those of us who make technology need to be very sure that it is not used destructively against the human species. The responsibility is very, very high. And the danger is extreme. These people were revelling in the fact that they could develop targeted assassination drones and sell them to any country in the world.
Heinous.
I'm not surprised. After finishing my degree, I went to optimize a couple of production lines. I didn't get much detail and thought it was for plastic/glass bottles (filling liquids and moving them around - many tiny grabbers, simple movement, not that wide tracks).
After about two months of work, once the production line had been parallelized and its speed drastically increased, I went to see it work.
What I saw shocked me and I immediately quit. It was a production line for handling young and grown chicks, female and male. Debeaking, throat slitting. I was absolutely shocked that none of my superiors had told me exactly which product was being handled.
After seeing the horrific product of my work, I quit.
Since then, I'm not surprised, given what horrors we do to living animals, that we are ready to do them to each other.
I doubted the meaning of my work at university. What did I do, spend four years at college to create killing machines? I never thought I'd do that.
That's the thing, isn't it? They can't do anything alone.
It always amazes me that those big entities exist, because they require such a huge pool of highly educated and skilled people. What are all those geniuses at the NSA thinking? I can't imagine that somebody smart enough to work there is not smart enough to understand the consequences of working there. So why are they not quitting?
Social pressure and money are part of the equation, certainly. I remember when I turned down a Google interview, my close circle thought it was weird that I did that, even more so for ethical reasons.
Having the best toys, budgets and projects certainly helps as well.
But still, I wonder.
I applaud your morality. In terms of moving the needle towards a more moral world, do you think employee resignations have a positive effect, or do they create a less moral survivor bias where the people who are left have fewer limits or concerns?
I'm sure disclosure would be illegal per your employee contract, but are there any other steps concerned employees can take?
This is never true. No government will allow its companies (and in this context that's how it's seen) to sell weapons to countries they don't approve of. That said, apparently the defense industry is often irresponsible at best.
Google's leaders are either very naive and actually think the Pentagon wouldn't repurpose their tech for killing people, or they know exactly that this is how it will be used but agreed to sell it anyway, because money.
I'm much more inclined to believe it's the latter.
I can beat you to death with a hammer or stab you to death with a knife. Should we not build hammers and knives? Should hammers and knives be subject to KYC laws?
Sure, you can argue that it's different because those are things the common man can get, whereas only the state can afford surveillance dragnets and drones, but it wasn't that long ago that only the state could afford computers.
Edit: Apparently I struck a nerve.
Technology transfers between military and civilian applications all the time. Propeller technology that helped now-obsolete submarines stay quiet is fine-tuned in a different manner to yield more environmentally friendly watercraft. A drone that can disperse insecticides on only the crops that need it can deliver chemical weapons with some slightly different fine-tuning.
A 1984 (or 2018 UK, if you like) surveillance and law enforcement system could be used to track down corruption in government, suppress dissidents, identify insider trading, identify human trafficking, etc. It all depends on who's using it. (I personally don't trust any government to properly wield that kind of power.)
The technology doesn't care. It's all how you use it.
I’d like to offer the point of view that if the drones become better and more surgical in their precision, it would reduce civilian casualties.
Like it or not the world is full of extremists who would like nothing more than to hurt innocent people. There is no “oh just send the cops and arrest them!” route to take.
Shit, just look at the time Osama bin Laden could have been bombed with a Tomahawk missile during Clinton's presidency. Clinton didn't do it because of the potential to kill a Saudi prince bin Laden was meeting at the time.
Would those angry Googlers be against surgically killing Osama? I think not.
Better drone software might help track a potential target and present the optimal window in which the target could be shot with reduced civilian casualties. It could also provide better intel to enable a surgical ground strike, which would put more American soldiers at risk but would allow for better intel and, again, fewer civilian deaths.
Lastly, it could offer new knowledge and experience in tracking humans with drones during humanitarian disasters. It could also help in tracking victims of kidnapping. Are the Googlers opposed to rescuing the hundreds and thousands kidnapped by Boko Haram and company?
Who is going to go into the African heart of darkness to rescue those people? Is it the armchair Googlers who pretend to know better?
> I’d like to offer the point of view that if the drones become better and more surgical in their precision, it would reduce civilian casualties.
It would reduce collateral casualties per target attacked, which would make the drones easier to use with looser target selection criteria, which might both increase number of targets attacked and increase the number and ratio of incorrect-target-selection casualties.
The law of unintended consequences is most likely to sneak up and bite you when you only bother to consider first-order effects.
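To make the rebound arithmetic concrete, here is a sketch with purely illustrative numbers (none of these figures come from any real program):

```python
# Hypothetical, illustrative figures only: per-strike precision improves,
# but looser selection criteria multiply how often strikes are authorized.
strikes_before, collateral_per_strike_before = 10, 5
strikes_after, collateral_per_strike_after = 100, 1

total_before = strikes_before * collateral_per_strike_before
total_after = strikes_after * collateral_per_strike_after

# Collateral per strike fell fivefold, yet total collateral doubled.
print(total_before, total_after)  # 50 100
```

This is the same rebound logic as the Jevons paradox: making something cheaper per use tends to increase total use.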
I agree with you partly, and I think often people who are critical of all military actions are not considering the importance of a strong military in deterring large scale conflict.
But I think it's also true that carrying out military operations (even precise ones) in unstable parts of the world helps violent extremists gain support among the broader population and does nothing to help alleviate the instability that gives rise to these extremists in the first place.
When I consider how few deaths there actually are from extremist groups operating in Western countries it makes me wonder if the scale of our response is really appropriate to the severity of the issue, and whether our actions aren't helping to perpetuate the very issues they're intended to address.
Can you guarantee the technology won't fall into enemy hands?
Can you guarantee our government won't initiate illegal aggression?
Won't subvert democracies?
Won't have another Gulf of Tonkin?
Won't target the families of terrorists, as our current President has suggested?
No, you can't guarantee those things.
Guns don't kill people, people do. And people are sometimes evil, and sometimes break the law, and sometimes make mistakes. And sometimes the gun is stolen. So maybe some engineers don't want their company to make any guns.
You don't get to ask, "Would those angry Googlers be against the technology always being used the way they intended?"
Instead you have to ask, "Is it possible for this technology to be used in ways those Googlers would object to?" And of course the answer is yes.
Many on the Manhattan Project thought that bombing Nagasaki was completely unnecessary. Some probably thought Hiroshima was unnecessary too, and that a demonstration of the bomb's power would have been sufficient.
But again, this assumes the people giving orders to the drone are the good guys.
And I have never seen any good guys in my history books or in the news.
Hence I always assume, when power is given to somebody, that the person doesn't have my best interests in mind.
Let's all remember that it's possible for any of our countries to become a dictatorship one day. Just because we have enjoyed a lot of freedom for the last few decades doesn't exempt us from acting like we could lose it at any moment. Because we definitely can.
More pragmatically, with powerful AI, giant communications networks, huge databases of everything and everybody, cameras with facial detection and wiretapping everywhere, do you really want to add drones to the collection of what the powers that be can do?
The problem is that people believe in a fair world, and that our foes operate with the same values, so that if we accommodate them we'll be okay; therefore we should not develop means to better wage asymmetrical warfare, because that would be an unfair advantage. Moreover, what if they are in the right and we are in the wrong, in terms of history?
They fear that this current administration and future administrations may come to depend on, as Secretary Clinton put it, "droning" people we simply disagree with rather than actual military adversaries.
The main question is the effectiveness of the system, given some baselines.
> Like it or not the world is full of extremists who would like nothing more than to hurt innocent people.
"Full of"? The world is more peaceful than it's ever been. Extremists do hurt innocent people, and we should not ignore them. But with each choice that we need to make, we should carefully consider the pros and cons. Is there really a net benefit here?
They most likely don't want to work for a company involved in any killing.
> Would those angry Googlers be against surgically killing Osama?
Probably.
I can't find the quote, but I was reading something about the Troubles with the IRA, and the response, that really stuck with me. The author said something like, "When you use lethal force against terrorists, it lets them feel that it's fair to use it against you."
We won't defeat violence by violence.
I think we need to challenge ourselves to become less bloodthirsty as we become more technologically capable. (If only to set a good example for our progeny... https://en.wikipedia.org/wiki/I_Have_No_Mouth,_and_I_Must_Sc...)
Once you make the mental flip, it becomes really easy to think up defense systems that work without causing any casualties or deaths at all - not our guys, nor civilians, nor the enemy. And if we just can't live with that, we can always kill them later: https://en.wikipedia.org/wiki/Saddam_Hussein#Execution
That's terrible reasoning, most likely formed by the skewed view that targeted assassinations are needed and cool. Why does the US need to be at war all the fucking time? There are so many problems with drones doing targeted killings, and so many more if they are to be powered by AI that gets 'cucumber' right 98% of the time. This is such a horrible, horrible, horrible idea.
> Shit, just look at the time Osama bin Laden could have been bombed with a tomahawk missile during Clinton's presidency. He didn't do it because of the potential to kill a Saudi prince he was meeting at that time.
Here's an article on that very thing. It didn't mention anything about a Saudi Prince though.
But your points are valid. Like anything, it could be used for non-objectionable purposes. How morally objectionable do you think the use of this technology would be on the whole, given the country using it, etc.? Do you think it would help the US and its allies become more or less authoritarian? How long until it is given or sold to local police departments, like other military equipment?
And the internet has allowed for much more "targeted surveillance" too, from across the world.
Guess how it's been used? It's been used to "target" everyone. Why? Because it's gotten cheap and easy enough to use on many more people at once - just how automated drone strikes will be soon.
I think you're naive if you think this will "improve" war conditions. Here's one story that may bring you back to reality, about how these automated drone strikes are more likely to be used in the future:
The problem with dangerous technologies isn't when they're used under the best possible circumstances. It's all of the others. Nuclear bombs can propel spaceships or dig canals, but nobody is protesting that.
> the world is full of extremists
> Osama bin Laden could have been bombed
What you seem to be saying, and what we hear all the time, is that spending money and doing work that increases the power and capabilities of the US military will make us all better off.
You also seem to be saying that Osama bin Laden is an extremist.
But in the 1970s, as Afghanistan was working towards becoming a more secular society, the US military and intelligence agencies were arming Osama bin Laden and his fellow jihadis, the proto Taliban and proto Al Qaeda. Who wanted, among other things, for the secularization of Afghanistan to stop, and for an Islamic dominated government to come in. An effort which the US succeeded in, along with their partner Osama bin Laden.
This being the case, I am not exactly sure when bin Laden and people like him became extremists. I suppose it was after the US began its military occupation of Saudi Arabia. Osama bin Laden opposed the US military occupation of his country.
This may sound equivocal about bin Laden, but the US is more equivocal about bin Laden. I think he never should have been armed by the US. People of a like mind said as much then. Others disagreed.
In other words, if the US is making political errors (or is not making errors and is pursuing negative goals), more power and capability to carry out those erroneous policies will not help matters.
For example, Trump just escalated the conflict in the Middle East this week, which only satisfies religious fundamentalists. Handing him more power to do so will not help things, it will just mean more 9/11s in response to the blow he just landed against Arabs/Muslims.
> I’d like to offer the point of view that if the drones become better and more surgical in their precision, it would reduce civilian casualties.
I am not sure why some people instantly assume that the whole purpose of making drones more autonomous is to make them more precise: historically, the DoD and the US military have made virtually zero effort to even try to reduce civilian "casualties".
Maybe I am too cynical, but I genuinely think the military is only willing to invest in technology that would help it expand current and future operations, disregarding the impact these will have on the civilian population of foreign territories... probably because it's orders of magnitude cheaper to just pay someone to write a public statement denying every statistic published by neutral NGOs around the world.
I wouldn’t say that developing military hardware necessarily negates the “don’t be evil” principle (especially if the developed articles are dual-use, for both military and civilian applications). The Western world, our principles and values, have prospered for more than half a century in a Pax Americana afforded, in large part, by the prosperous US Military Industrial Complex.
I totally get the objection to developing combative AI - that’s a separate ethical question - but you can contribute to the military and still maintain your humane values.
Given that the US military has killed nearly 4,000 people with drone strikes in Pakistan over the past decade, a country in which no formal military conflict exists, nor any formal enemy, just vague accusations of terrorist networks (and likely a bunch of political dissidents fed to them by the Pakistani government), I am really wondering where the "Pax" is coming from. Because if you do the math the odds of any given person knowing someone who was killed by a drone strike, or someone who knows someone who was, are pretty damn high. The US has brought hell to Pakistan.
Kudos to those who are ready to stand up for their principles.
If you are at Google and wondering whether you should resign or not, remember this: the market for AI talent is super hot. You will immediately find lots of great and challenging AI work pushing humanity forward.
> You will immediately find lots of great and challenging AI work pushing humanity forward.
They'll find jobs quickly, that's for sure.
But "work pushing humanity forward"? There's precious little of that in any skill sector - not at FAANG salary levels, anyways. The vast bulk of the work that the vast majority of us do is simply about pushing the investor's balance sheets forward - not "humanity".
Corollary: plenty of skilled engineers with fewer moralistic constraints will jump at the chance to do interesting work for high pay. For a company as large and wealthy as Google, they can continue to raise offer salaries until they are adequately staffed.
There is a school of thought that recommends “moral” people doing “immoral” work because if those people left then other “immoral” people will take those jobs and more readily implement “immoral” features. So the “moral” engineers have an incentive to stay and act as a front line against “immoral” actions, or at least have an insider’s position for whistleblowing.
Military drones are here to stay, and whether or not the US builds them, other military powers certainly will.
Hard to tell if those people were planning on leaving anyway, and just wanted to make a splash. How many people quit each week in a company with 90,000 employees?
Only a small percentage of engineers at Google are AI experts. Sure, more people use it, but they probably just make a service call and get some magic results back.
Definitely scary, but what happens if enemy states create this technology before us? That's why we need a defence sector that does, in fact, build these tools. If both sides have the technology then a stalemate is reached. Look at nuclear weapons; we can all bomb each other out of existence, and so, no one does. Now look at the countries that lagged behind.
For years we have known that US drone strikes can very accurately kill anonymous people - often civilians (such as women and children):
"Every independent investigation of the strikes has found far more civilian casualties than administration officials admit. Gradually, it has become clear that when operators in Nevada fire missiles into remote tribal territories on the other side of the world, they often do not know who they are killing, but are making an imperfect best guess." [1]
"Leaked military documents reveal that the vast majority of people killed have not been the intended targets, with approximately 13% of deaths being the intended targets, 81% being other "militants", and 6% being civilians." [2]
"strikes have killed 3,852 people, 476 of them civilians. But those counts, based on news accounts and some on-the-ground interviews, are considered very rough estimates" [1]
Not only that, we bomb inside of countries that are (sort of?) our allies, without informing them and without their consent:
"Pakistan's Prime Minister, Nawaz Sharif, has repeatedly demanded an end to the strikes, stating: "The use of drones is not only a continual violation of our territorial integrity but also detrimental to our resolve and efforts at eliminating terrorism from our country" [2]
You seem to be suggesting that the fact of one's residency in a country should influence one's moral evaluation of that country, that there should be a different set of standards applied to one's own country than a foreign country. I believe that standards of good and evil should be applied uniformly to all nation states. If anything, one should hold one's own country to a higher standard since you have the most hope of influencing your own nation's behavior.
Arguably, it is precisely the inability to consider international conflict from a vantage point outside a nationalist worldview that is the root cause of all external wars.
It would be puerile for a Swedish person, far from it for an American. The fact that it’s the military of my own country has little to do with how evil it is or isn’t.
Watching too many movies and too much TV has the converse effect - for the most part, visual media is full of nationalistic ra-ra nonsense.
The US military (As have all other militaries that have done anything of note) has done some ridiculously evil things. The drone strike program isn't the worst of them by any means, but it's not its brightest moment, either.
> Thinking that the military of your own country is "evil" seems a bit puerile to me.
How true this is in a specific case depends on what that country’s military is doing at the time; categorically dismissing the idea that a nation's military can reasonably be seen as evil by a citizen seems more than a bit naive to me.
In 2015 Alphabet changed their motto to "do the right thing". If you interpret "right" as "correct", then "doing the right thing" can mean 'correctly' bombing civilians.
You'd think that people would know that concepts like "good" and "evil" are complex and somewhat subjective. There are plenty of times when you can kill someone and not be considered "evil".
At the very least, most people agree that killing in self defense is not evil.
It's not directly killing people. That's like saying helping improve GPS satellites kills people because weapons systems use GPS and would benefit from increased accuracy.
I wish there were a legally enforceable version of Douglas Crockford's "Good, not Evil" license. I haven't released any source code that could have military applications yet, but if I ever do, I want to make it 100% clear that it's not to be used for any task related to killing or injuring other people. We're in a unique position as programmers, where even the tiniest bit of our code can affect thousands or millions of people across the globe, and this terrifies me.
The GPL has already shown us that a license has the power to change culture and behavior (in however small a way). We should be able to extend this approach to other values we hold dear.
I just want to congratulate those Googlers. It isn't usual to quit a good job over ethical concerns. The world would be a lot better if there were more people like you.
Somebody else will fill the gap. It is simply a consequence of military science: If a human in the loop makes combat systems less effective, then other countries will seek the advantage over others by removing the human from the loop. It's a classic arms race at this point.
... This is a Pandora's box that has already been opened, I'm afraid.
I've been asked to work for military industry companies before. And I have always declined for ethical reasons.
But as I sit here and think about it, I wonder if it's a good thing that a person like myself (who believes he's on the ethical high ground) declines these types of jobs.
Someone is going to take the job. Perhaps someone less skilled than myself, perhaps someone less ethical than myself? What is the result of that?
As another poster wrote, it's "good" that the targeting gets more precise, meaning less collateral damage.
But to each his own. We need to be able to sleep at night as well. And that, to me, also seems like a really good reason to decline.
I'm kind of on the fence about wanting to work in that industry.
> I've been asked to work for military industry companies before. And I have always declined for ethical reasons.
>
> But as I sit here and think about it, I wonder if it's a good thing that a person like myself (who believes he's on the ethical high ground) declines these types of jobs.
>
> Someone is going to take the job. Perhaps someone less skilled than myself, perhaps someone less ethical than myself? What is the result of that?
Whenever I find myself in this line of thought, I always remember: I am not that special. The impact that anyone has on a workplace is small and mostly inconsequential--more important is group momentum and culture. The likelihood that you'd do bad in a bad workplace is much higher than that you'd be able to stand fast and do good--the world just does not work that way.
It depends on how much of an impact you think you can make. I'm always of the mind that these defence contractors' primary motivation is what their customers (i.e. the military) desire, rather than making considerations for anyone on staff.
I would be curious to see the list of employees who quit for this purpose. They are making a statement. They might as well publicly disclose their identities to inspire more people.
Also, I wonder: are most of them financially independent enough to have made this decision? When money is not a worry, people have the freedom to truly align themselves externally with their internal core values. If you are constantly worried about paying rent or securing your kids' future, you make compromises. That is not ideal for building a great society.
People who disagree with you often disagree with each other as well. This doesn't come through in a media (of all political leanings) that tends toward a narrow set of well-funded viewpoints.
joebadmo|7 years ago
detaro|7 years ago
mmjaa|7 years ago
They demonstrated the willingness to push the company's technology into heinous, heinous territory. The kind of thing where a drone would be able to follow a single person in a crowd, and target them for execution - unguided, of course.
I quit the next day. Those of us who make technology, need to be very sure we see that it is not used destructively against the human species. The responsibility is very, very high. And, the danger is extreme. These people were revelling in the fact that they could develop targeted assassination drones and sell them to any country in the world.
Heinous.
brolover|7 years ago
After about 2 months of work, and after the production line got parallelized and speed drastically increased I went to see it work.
What I saw shocked me and I immediatelly quit. It was a production line for handling of female and male young and grown chicks. Debeaking, throat slitting. I was absolutely shocked how none of the superiors told me exactly which product was being handled.
After seeing the horrific product of my work I quit.
Since then, I'm not surprised, given what horrors we do to living animals, that we are ready to do them to each other.
I doubted the meaning of my work at university, what did I do? Spend 4 years at college to create killing machines? I didn't think I'd ever do that.
sametmax|7 years ago
It always amazes me that those big entities exist, because they require such a huge highly educated and skilled human power. What are all those genius at the NSA thinking ? I can't imagine somebody smart enough to work here is not smart enough to understand the consequences of working there. So why are they not quitting ?
Social pressure and money are part of the equation, certainly. I remember when I turned down a Google interview, my close circle though it was weird that I did that, even more for ethical reason.
Having the best toys, budgets and projects certainly helps as well.
But still, I wonder.
everdev|7 years ago
I'm sure disclosure would be illegal per your employee contract, but are there any other steps concerned employees can take?
tormeh|7 years ago
This is never true. No government will allow its companies (and in this context that's how it's seen) to sell weapons to countries they don't approve of. That said, apparently the defense industry is often irresponsible at best.
c3534l|7 years ago
walshemj|7 years ago
mtgx|7 years ago
I'm much more inclined to believe it's the latter.
dsfyu404ed|7 years ago
Sure you can argue that it's different because those are things the common man can get whereas on the state can afford surveillance dragnets and drones but it wasn't that long ago that only the state could afford computers.
Edit: Apparently I struck a nerve.
Technology transfers between military and civilian application all the time. Propeller technology that helped submarines that are now obsolete stay quiet is fine tuned in a different manner to yield more environmentally friendly watercraft. A drone that can disperse insecticides on only the crops that need it can deliver chemical weapons with some slightly different fine tuning.
An 1984 (or 2018 UK if you like) surveillance and law enforcement system could be used to track down corruption in government, suppressing dissidents, identifying insider trading, identifying human tracking, etc. It all depends on who's using it. (I personally don't trust any government to properly wield that kind of power.)
The technology doesn't care. It's all how you use it.
whb07|7 years ago
Like it or not the world is full of extremists who would like nothing more than to hurt innocent people. There is no “oh just send the cops and arrest them!” route to take.
Shit, just look at the time Osama bin Laden could have been bombed with a tomahawk missile during Clinton’s presidency. He didn’t do it because of the potential to kill a Saudi prince he was meeting at that time.
Would those angry Googlers be against surgically killing Osama? I think not.
Better drone software might help track a potential target and present with the optimal window in which a target could be shot and have reduced civilian casualties. It could also present with better intel to let a surgical ground strike which would put more American soldiers at risk but would allow for better intel and again less civilian deaths.
Lastly, it could offer new knowledge and experience in tracking humans with drones during humanitarian disasters. It could also help in tracking victims of kidnapping, are the Googlers opposed to rescuing the hundreds and thousands kidnapped by Boko Haram and company?
Who is going to go into the African heart of darkness to rescue those people? Is it the arm chair Googlers who pretend to know better?
dragonwriter|7 years ago
It would reduce collateral casualties per target attacked, which would make the drones easier to use with looser target selection criteria, which might both increase number of targets attacked and increase the number and ratio of incorrect-target-selection casualties.
The law of unintended consequences is most likely to sneak up and bite you when you only bother to consider first-order effects.
jeffreyrogers|7 years ago
But I think it's also true that carrying out military operations (even precise ones) in unstable parts of the world helps violent extremists gain support among the broader population and does nothing to help alleviate the instability that gives rise to these extremists in the first place.
When I consider how few deaths there actually are from extremist groups operating in Western countries it makes me wonder if the scale of our response is really appropriate to the severity of the issue, and whether our actions aren't helping to perpetuate the very issues they're intended to address.
VikingCoder|7 years ago
Can you guarantee our government won't initiate illegal aggression?
Won't subvert democracies?
Won't have another Gulf of Tonkin?
Won't target the families of terrorists, as our current President has suggested?
No, you can't guarantee those things.
Guns don't kill people, people do. And people are sometimes evil, and sometimes break the law, and sometimes make mistakes. And sometimes the gun is stolen. So maybe some engineers don't want their company to make any guns.
You don't get to ask, "Would those angry Googlers be against the technology always being used the way they intended?"
Instead you have to ask, "Is it possible for this technology to be used in ways those Googlers would object to?" And of course the answer is yes.
Many on the Manhattan project thought that bombing Nagasaki was completely unnecessary. Some probably thought Hiroshima was unnecessary, that a demonstration of the power would be sufficient.
sametmax|7 years ago
But I have never seen any good guys in my history books or in the news.
Hence I always assume, when power is given to somebody, that the person doesn't have my best interest in mind.
Let's all remember that it's possible for any of our countries to become a dictatorship one day. Just because we have enjoyed a lot of freedom over the last decades doesn't exempt us from acting like we could lose it at any moment. Because we definitely can.
More pragmatically, with powerful AI, giant communication networks, huge databases on everything and everybody, and cameras with facial recognition and wiretapping everywhere, do you really want to add drones to the collection of what the powers that be can do?
mc32|7 years ago
They fear that this current administration and future administrations may depend on, as Secretary Clinton put it, “droning” people we simply disagree with rather than actual military adversaries.
The main question is effectiveness of the system, given some baselines.
wyldfire|7 years ago
"Full of"? The world is more peaceful than it's ever been. Extremists do hurt innocent people, and we should not ignore them. But with each choice that we need to make, we should carefully consider pros and cons. Is there really a net benefit here?
ondrae|7 years ago
They most likely don't want to work for a company involved in any killing.
carapace|7 years ago
Probably.
I can't find the quote, but I was reading something about the troubles with the IRA, and the response, that really stuck with me. The author said something like, "When you use lethal force against terrorists it lets them feel that it's fair to use it against you."
We won't defeat violence by violence.
I think we need to challenge ourselves to become less bloodthirsty as we become more technologically capable. (If only to set a good example for our progeny... https://en.wikipedia.org/wiki/I_Have_No_Mouth,_and_I_Must_Sc...)
Once you make the mental flip it becomes really easy to think up defense systems that work without causing any casualties or deaths at all, not our guys, nor civilians, nor the enemy. And if we just can't live with that, we can always kill them later: https://en.wikipedia.org/wiki/Saddam_Hussein#Execution
Clubber|7 years ago
Here's an article on that very thing. It didn't mention anything about a Saudi Prince though.
http://www.latimes.com/nation/nationnow/la-na-nn-bill-clinto...
But your points are valid. Like anything, it could be used for non-objectionable purposes. How morally objectionable do you think the overall use of this technology would be, given the country using it, and so on? Do you think it would help the US and its allies become more or less authoritarian? How long until you think it would be given or sold to local police departments, like other military equipment?
mtgx|7 years ago
Guess how it's been used? It's been used to "target" everyone. Why? Because it's gotten cheap and easy enough to use on many more people at once - just as automated drone strikes soon will be.
I think you're naive if you think this will "improve" war conditions. Here's one story that may bring you back to reality, and show how these automated drone strikes are more likely to be used in the future:
https://www.middleeastmonitor.com/20180501-journalists-chall...
Do you think this guy was a victim of "inaccurate" drone attacks and "collateral damage"?
vidarh|7 years ago
And some of them are parts of governments.
imbokodo|7 years ago
What you seem to be saying, and what we hear all the time, is that spending money and doing work that increase the power and capabilities of the US military will make us all better off.
You also seem to be saying that Osama bin Laden is an extremist.
But in the 1970s, as Afghanistan was working towards becoming a more secular society, the US military and intelligence agencies were arming Osama bin Laden and his fellow jihadis, the proto-Taliban and proto-al-Qaeda, who wanted, among other things, the secularization of Afghanistan to stop and an Islamist-dominated government to take over. An effort in which the US succeeded, along with its partner Osama bin Laden.
This being the case, I am not exactly sure when bin Laden and people like him became extremists. I suppose it was after the US began its military occupation of Saudi Arabia. Osama bin Laden opposed the US military occupation of his country.
This may sound equivocal about bin Laden, but the US is more equivocal about bin Laden. I think he never should have been armed by the US. People of a like mind said as much then. Others disagreed.
In other words, if the US is making political errors (or is not making errors and is pursuing negative goals), more power and capability to carry out those erroneous policies will not help matters.
For example, Trump just escalated the conflict in the Middle East this week, which only satisfies religious fundamentalists. Handing him more power to do so will not help things, it will just mean more 9/11s in response to the blow he just landed against Arabs/Muslims.
pera|7 years ago
I am not sure why some people instantly assume that the whole purpose of making drones more autonomous is to make them more precise: historically, the DoD/US military has made virtually zero effort to even try to reduce civilian "casualties".
Maybe I am too cynical, but I genuinely think that the military is only willing to invest in technology that would help it expand current and future operations, disregarding the impact these will have on the civilian population of foreign territories... probably because it's orders of magnitude cheaper to just pay someone to write a public statement denying every statistic published by neutral NGOs around the world.
flyinglizard|7 years ago
I totally get the objection to developing combative AI - that’s a separate ethical question - but you can contribute to the military and still maintain your humane values.
titzer|7 years ago
I say fuck that.
djhworld|7 years ago
I'm trying to find this principle on their website, and it doesn't seem to be there in their "values" section
https://www.google.com/about/
Did they take it out?
option2|7 years ago
If you are at G and thinking about whether you should resign or not, remember this: the market for AI talent is super hot. You will immediately find lots of great and challenging AI work pushing humanity forward.
thirduncle|7 years ago
They'll find jobs quickly, that's for sure.
But "work pushing humanity forward"? There's precious little of that in any skill sector - not at FAANG salary levels, anyway. The vast bulk of the work that the vast majority of us do is simply about pushing investors' balance sheets forward - not "humanity".
rm_-rf_slash|7 years ago
There is a school of thought that recommends “moral” people doing “immoral” work because if those people left then other “immoral” people will take those jobs and more readily implement “immoral” features. So the “moral” engineers have an incentive to stay and act as a front line against “immoral” actions, or at least have an insider’s position for whistleblowing.
Military drones are here to stay, and whether or not the US builds them, other military powers certainly will.
Ultimately, I don’t think this changes anything.
oh_sigh|7 years ago
Only a small percentage of engineers at Google are AI experts. Sure, more people use it, but they probably just make a service call and get some magic results back.
Lionsion|7 years ago
As far as I can tell, this is the original: https://www.youtube.com/watch?v=9CO6M2HsoIA
carlosrg|7 years ago
Thinking that the military of your own country is "evil" seems a bit puerile to me. Watching too many movies and TV shows can have that effect.
peterwwillis|7 years ago
"Every independent investigation of the strikes has found far more civilian casualties than administration officials admit. Gradually, it has become clear that when operators in Nevada fire missiles into remote tribal territories on the other side of the world, they often do not know who they are killing, but are making an imperfect best guess." [1]
"Leaked military documents reveal that the vast majority of people killed have not been the intended targets, with approximately 13% of deaths being the intended targets, 81% being other "militants", and 6% being civilians." [2]
"strikes have killed 3,852 people, 476 of them civilians. But those counts, based on news accounts and some on-the-ground interviews, are considered very rough estimates" [1]
Not only that, we bomb inside of countries that are (sort of?) our allies, without informing them and without their consent:
"Pakistan's Prime Minister, Nawaz Sharif, has repeatedly demanded an end to the strikes, stating: "The use of drones is not only a continual violation of our territorial integrity but also detrimental to our resolve and efforts at eliminating terrorism from our country" [2]
[1] https://www.nytimes.com/2015/04/24/world/asia/drone-strikes-... [2] https://en.wikipedia.org/wiki/Drone_strikes_in_Pakistan
Yeah, I have no problem calling our military evil.
istjohn|7 years ago
Arguably, it is precisely the inability to consider international conflict from a vantage point outside a nationalist worldview that is the root cause of all external wars.
vkou|7 years ago
The US military (As have all other militaries that have done anything of note) has done some ridiculously evil things. The drone strike program isn't the worst of them by any means, but it's not its brightest moment, either.
dragonwriter|7 years ago
How true this is in a specific case depends on what that country’s military is doing at the time; categorically dismissing the idea that a nation's military can reasonably be seen as evil by a citizen seems more than a bit naive to me.
wilsonnb|7 years ago
At the very least, most people agree that killing in self defense is not evil.
archagon|7 years ago
The GPL has already shown us that a license has the power to change culture and behavior (in however small a way). We should be able to extend this approach to other values we hold dear.
dfsegoat|7 years ago
... This is a Pandora's box that has already been opened, I am afraid.
mr_rheee|7 years ago
But as I sit here and think about it, I wonder if it's a good thing that a person like myself (who believes he's on the ethical high ground) declines these types of jobs.
Someone is going to take the job. Perhaps someone less skilled than myself, perhaps someone less ethical than myself? What is the result of that?
As another poster wrote, it's "good" that the targeting gets more precise, meaning less collateral damage.
But to each his own. We need to be able to sleep at night as well. And that, to me, also seems like a really good reason to decline.
I'm kind of on the fence about wanting to work in that industry.
typomatic|7 years ago
Whenever I find myself in this line of thought, I always remember: I am not that special. The impact that any one person has on a workplace is small and mostly inconsequential--group momentum and culture matter more. The likelihood that you'd do bad things in a bad workplace is much higher than the likelihood that you'd be able to stand fast and do good--the world just does not work that way.
throwaway6497|7 years ago
I would be curious to see the list of employees who quit for this reason. They are making a statement; they might as well publicly disclose their identities to inspire more people.
Also, I wonder: are most of them financially independent enough to have made this decision? When money is not a worry, people have the freedom to truly align their external actions with their internal core values. If you are constantly worried about paying rent or securing your kids' future, you make compromises. That is not ideal for building a great society.
21|7 years ago
Do you think USA should have no military?
Would you vote to dismantle it?
ythn|7 years ago
Of course, there would be no conscientious objectors if it were a liberal event