> "Google’s Eric Schmidt summed up the tech industry’s concerns about collaborating with the Pentagon at a talk last fall. “There’s a general concern in the tech community of somehow the military-industrial complex using their stuff to kill people incorrectly,”
I sure am surprised that the discussion has already reached the point where tech companies are debating whether they kill people 'incorrectly'. I must have missed the democratic discussion about private businesses assisting in killing people at all, a duty traditionally exercised by states.
> I sure am surprised that the discussion has already reached the point where tech companies are debating whether they kill people 'incorrectly'. I must have missed the democratic discussion about private businesses assisting in killing people at all, a duty traditionally exercised by states.
Sure, it's valid to disagree with Google's involvement in this project, but I find it incredibly hard to justify the claim that private enterprises traditionally don't equip armed forces. Providing technology and hardware to armed forces is something that private companies have been doing for centuries. Heck, non-state groups selling arms to armed forces likely predates the existence of states as we know them. Most armies prior to the modern era were militias that privately purchased their own equipment.
This project is "assisting in killing people" to at most the same degree that developing munitions guidance, military radar, and sonar systems is "assisting in killing people" (I'd argue less, since this technology is purely about reviewing reconnaissance data, not about the actual deployment of weapons), and private companies have been developing those systems for over a century.
Literally all of the equipment used to kill people, from the rifles, to the bullets, to the fighter jets, to the parts to repair the fighter jets, is manufactured by private companies in the United States. So I’m confused by your confusion: Google’s assistance in the drone program hardly makes it the first private company to provide goods and services that are directly involved in killing people. Is Google somehow more culpable than the companies that make the bullets or bombs? How is providing software and computing power for that any different? You don’t think a private company writes the software that targets things for jets and missiles?
> I must have missed the democratic discussion about private businesses assisting in killing people at all, a duty traditionally exercised by states.
States have contracted out to private individuals and firms for the development of tools for that purpose (and the actual killing) for as long as states as distinct from private individuals exercising power have existed.
To me that sounds like a misreading of the statement. It makes more sense to interpret it to the effect that tech companies are worried that "the military-industrial complex is using their stuff to kill people" and that this is incorrect. Google doesn't really specialize in optimizing lethality, but it does make a lot of noise about ethical application of its tech.
When people refer to the military-industrial complex, they are referring to businesses like Lockheed Martin, Raytheon, etc.: the private companies involved in designing and building weapons.
> I must have missed the democratic discussion about private businesses assisting in killing people at all, a duty traditionally exercised by states.
You missed the "using their stuff to" part of that quote; the opportunities for melodrama here are fewer than you might've hoped.
In some way, every service you could offer anyone is part of the war machine. When you operate a daycare, you're taking care of somebody's kids, and that somebody either does something for, or is involved with, the military. One of the actions typically carried out by an active military is killing people.
Somebody is always using the products of your labour, in some fashion or level of abstraction, to kill people in whatever way their outfit, organization, or mandate deems appropriate.
And if we're to be more honest about what's going on here, you could just as easily say that Google is helping the U.S. government avoid killing people it never intended to kill. Also, I think it's rather reasonable to assume that the U.S. government's interest in continuing the drone program, or the military operations it replaces, is largely independent of Google's willingness to help with targeting. To me that indicates Google has something of a moral obligation to help if called upon to do so.
> I must have missed the democratic discussion about private businesses assisting in killing people at all, a duty traditionally exercised by states.
I agree with you, but I have to point out that nearly all of our wars are fought to make the world safe for American business. The killing fields run red so that we can mine the green shoots. The last time we voted on military action was right after 9/11, when we gave permission for an unending world-wide war and toppled two countries outright and publicly.
Many defense contractors made a killing off these shenanigans.
EDIT: Connecting this history to now, recall https://wikileaks.org/google-is-not-what-it-seems/
Paradoxically, Google getting involved has the potential, if not the actual goal, of decreasing the number of mistaken or misidentified targets.
I understand some "conscientious" Googlers feeling uneasy about Google getting involved in this sort of thing, but paradoxically, Google (or other AI resources) getting involved will very likely result in fewer incidents of hitting the wrong targets. In other words, a measurable reduction in "collateral casualties/damage".
> I must have missed the democratic discussion about private businesses assisting in killing people at all, a duty traditionally exercised by states.
Are you from the US? You might not be. It sounds like you're not aware of the multi-billion-dollar (maybe even trillion-dollar) military-industrial complex.
I'm also not sure what is meant by "democratic discussion". Who are the constituents here: everyone, employees, tech people only, HN users?
> kill people 'incorrectly'.
Right. I think that tells us where Google stands. It is already helping kill people; it just helps kill them "correctly", of course. In some kind of a nice, non-evil way, presumably.
> I must have missed the democratic discussion about private businesses assisting in killing people at all, a duty traditionally exercised by states.
I'm not sure if you're criticizing that this happens at all or if you're criticizing it as a new development. Either way, it's generally been actual killing that's reserved for the state (or its mercenaries). Private organizations building and refining weapons for the state has been a thing in America since the Old West.
We faced this at work recently (at our decidedly sub-Google scale) when a sales guy refused to bid on an opportunity at a weapons manufacturer, so we had some interesting discussions around the issue.
It's a little hard to make blanket statements that weapons/warfare are bad. There are good times to use weapons.
An obvious one was at the time of WWII. If the clever people had refused to work on weapons, things would have finished up potentially a lot worse for mankind generally.
And perhaps in our medium term future, as climate change becomes more and more real, a critical mass of people will decry the continued burning of fossil fuels. And if retrograde nations continue to poison our common resource, then maybe some global police force will need weapons to stop them.
> It's a little hard to make blanket statements that weapons/warfare are bad.
In the case of the US it's usually true. The last 3 major US wars were not fighting Nazis; the last 3 major US wars were mostly imperialistic BS; the last 3 major US wars resulted in 20-30 million killed in 37 nations[1].
Or put another way: just because violent predator X (with a long history of unjustly attacking others) managed to take down another, worse violent predator Y doesn't mean you should continue to arm and support violent predator X.
1. https://www.globalresearch.ca/us-has-killed-more-than-20-mil...
> An obvious one was at the time of WWII. If the clever people had refused to work on weapons, things would have finished up potentially a lot worse for mankind generally.
There is still debate about whether the use of nukes was at all justifiable; a lot of the motivation for the researchers was to stop the Nazis from getting there first, not beating Japan into submission.
That said, this isn't WW2. The US has the largest military on Earth and the most advanced weapon systems. US drone strikes are killing targets in civilian areas without repercussions. The POTUS has the authorisation to wage war against whomever he chooses as long as he can somehow relate them to someone who was involved in 9/11. Heck, the invasion of Iraq even violated international law with no repercussions.
The US military (CIA included) can pretty much do what it pleases and kill people whenever and wherever it wants. Don't want to kill right now? Off to Guantanamo they go. If they happen to be a US citizen just say they're an enemy combatant and make sure they never see a domestic court of law. Even torture is permissible.
If you're worried the US might not be sufficiently equipped to become a dictatorial global authority, you haven't been paying attention.
Mad respect to the sales guy. I'm doing the same and avoiding arms companies (my background is very useful for military research).
I acknowledge some warfare may be legitimate (having been bombed myself), but arms companies don't stop at selling to your personal favorite army which you consider morally right, they keep looking for more business abroad.
I don't want to be the one realizing that I'm sitting in a cozy air-conditioned office, having made money from the messed-up warfare in some far-away country, with a large financial incentive to cause more conflict there.
There's also an argument to be made that remaining on the cutting edge of military science is a form of national security. Being one of the first, or only, nations to develop and test a new kind of weapon means that you'll also be among the first nations to be able to fully assess its practical viability and evaluate its countermeasures.
Ethically, I wonder if this would be any different if Google built similar software for consumer applications and licensed it the way Android is licensed, with it then being used for surveillance both in the US and by far more paranoid governments.
I would imagine that this allows both Google and the US government to have far more insight and control over the direction of this and possible applications.
Not to dismiss the obvious ethical issues of Google having possibly harmful incentives and having its hands tied by the US government financially and legally, but all things considered, I think the main difference here is that Google is already doing this on a massive scale for the sake of selling ads. It is possible this can help save lives.
Please don't fall for this rhetoric. The technology helps kill people, not save lives. Saving lives means preventing people from being killed. You can't prove any of the drone strikes ever helped prevent people from being killed. You can prove a lot of people (including people most normal human beings would consider civilians) did get killed. And even if killing the target helped save lives, you can't prove those lives couldn't have been saved any other way.
Drone strikes don't save lives. Drone strikes take lives. The reason we use to justify taking those lives is that they might help save other lives. But mostly, drone strikes are trading the guaranteed death of foreigners for the possibility of saving American lives.
The word used is “outraged” and not “surprised” though. I think that’s more than just a semantic point. In fact there is no mention of surprise in the article. I think that’s a reasonable reaction, whereas surprise is a straw man, largely implying a naïveté which is notably absent in the article.
As a US citizen concerned with our bloated military budget but who also wants the US to remain the strongest military nation on Earth, I'd like to cut $100 billion from the $600 billion plus yearly US military budget, and allocate that money to infrastructure and social programs. We would still have by far the largest military budget in the world after this.
Whoever knows about the US military budget, how feasible would this be? What is the bulk of the military budget dedicated to?
The biggest fractions are personnel, operations, and health care (last I checked, the Defense Health Agency spends more than the Marine Corps ... and provides health care for 8.4 million Americans. The ones who are that strongest military). But the DoD budget actually pales in comparison to Medicare and Social Security. The US Government is a well-armed insurance company, not a weapons dealer selling health insurance.
You could cut a number of non-essential items, even if they're not the largest: the US Army is the largest employer of musicians in the country [0], and the military as a whole spent at least $437M on musicians in 2015 [1].
[0] https://www.npr.org/sections/therecord/2010/09/29/130212353/...
[1] https://www.politico.com/story/2016/05/pentagons-bands-battl...
It worked for the UK and the British East India Company; it should work fine for America. Tesla can set up mining camps on Mars and eliminate ten percent of its "underperforming" population every year, just like it does at its factory.
Google's new cloud offering, "flower express": you just type someone's personal ID, and the Google database looks up the last known position and sends a drone.
Let's be completely speculative and alarmist about the potential of AI: It could be like developing nuclear weapons first; if the US gets military AI wrong, it could quickly become a poor vassal state to the world's new superpower.[0]
People are alarmist and speculative because AI's potential is unknown. If the potential of the new blockchain technology is unknown then you can wait and see what happens. But given the stakes with military AI, you can't take even tiny risks; you can't wait and see if your country will be in history books as an experiment that lasted 250 years.
By declining to help the US military, Google engineers take that risk to a degree. But if they participate then they gain enormous leverage: Given the stakes, the US military can't afford to alienate them. I'd hope they can use their leverage to achieve related goals: Agreements banning the use of this technology against civilians, foreign or domestic, and banning sharing the tech with law enforcement. Leverage Congress into passing privacy and civil rights laws protecting Americans against abuses of the technology.
[0] Note that AI changes things in another way: For all human history, military power was tied to population size. In the future, with the right AI and some underground robot factories, potentially a small country could dominate. Maybe Singapore?
Not a small country, but a large transnational corp instead. Especially one with tentacles deep inside the worldwide communication web, plus large AI development efforts, plus grand aspirations of not being evil... Just imagine yourself in the position of wanting to make the world a better place and actually being able to herd the world into that better place.
Where you're wrong is that military AI isn't a traditional WMD. WMDs work as a deterrent because they can destroy a large area and kill a lot of people, sure. But the reason the Cold War never went hot is that it's immediately obvious when an atomic bomb is used and (to varying degrees) who used it. Also atomic bombs are notoriously difficult to produce -- sure, states will eventually get there (see NK) but "non-state actors" can't even begin to develop the equipment to produce them.
Military AI... not so much. Sure, you might have an idea who hates you at any given moment and who has the means, but it's far fuzzier than a missile you can neatly track from point A to point B. Plus we already know how easy it is to produce cheap knockoffs.
Unlike nukes, military AI isn't a deterrent. Autonomous weapon systems aren't the new H bombs, they're the new AK47s and Toyota pickup trucks.
Yeah, there's a real risk of a blitzkrieg sort of situation where some country builds a drone army that just picks a traditional army apart on the battlefield.
Eventually someone is going to cut the cord and have truly autonomous armed vehicles and whoever does it first is going to have a tremendous tactical advantage.
[0] -> If we're talking robot armies, it will still depend on a "smaller" state's access to resources. You can't build robots out of thin air, unless you plan on hacking yourself one. But that depends on someone building one in the first place.
I am surprised that this is news. Google has done US government work for a long time. After 9/11 the newly formed Department of Homeland Security wanted pictures and accurate data on residential addresses in the US. At first they went to the credit bureaus and data brokers to collect and store this information, since those firms already had researchers on the ground collecting data. Google came along with Google Earth, and not long after that, Google Street View. Think about how long it was before they were able to actually monetize Google Earth and Maps. They perfected the art of geodata collection, likely paid for via government grants [0] and contracts. Even today they sell access to different government agencies [1].
[0] https://en.wikipedia.org/wiki/Google_Earth#History
[1] https://enterprise.google.com/maps/government/
As long as there are two people left on the planet, someone's going to want someone dead. Defense contracts will always be a good investment, and they can let you get close to people in power more easily than lobbying can; no wonder Google is taking the opportunity.
So when Damore released his manifesto, a lot of people wanted Google to fire him; otherwise they would quit themselves.
Where is the Google employee outrage now? I mean, this right here is bad, it has serious implications. Drone killings are the most outrageous thing the U.S. has done in a while...
username223:
"Patches welcome!"
PopsiclePete:
...what? This is America. Private industry makes all killing machines. Last I checked, Colt, Boeing, Lockheed Martin - all private companies.
booleandilemma:
Why are they surprised? Google is the Bell Labs of our time.
https://en.wikipedia.org/wiki/Top_100_Contractors_of_the_U.S...
nwrk:
Does the same apply to my Google Photos?
ehsankia:
This stuff already exists to some extent and can be used by anyone (within Google's terms & services).
https://cloud.google.com/video-intelligence/
majestik:
“You scratch my back, I scratch yours.”