> "Palantir software is instrumental to the operations of ICE, which is planning one of the largest-ever targeted immigration enforcement raids this weekend on thousands of undocumented families. Activists argue raids of this scale would be impossible without software like Palantir."
Once and for all, it is the policy makers, not the tech industry, who are responsible for these operations. Tech enables people to do things more easily, which can be good or evil. It is people who should decide, be concerned, and push the lawmakers toward the right/justified path.
I used to think the same. I now think that this reasoning is faulty.
Cars can be used for good or evil. For the most part we leave it up to the users to do the right thing, however we don't just leave it up to the users. If a modern car manufacturer were to make a vehicle with no safety features whatsoever, we would rightly call them out as a bad actor. Seat belts, air bags, bumpers, crumple zones, defoggers, windshield wipers, and on and on. We have laws that require a basic level of safety for these potentially dangerous machines.
There is no reason we shouldn't expect tech and software firms to include basic "safety" considerations of their own.
Remember all those companies that helped implement censorship and spying systems for authoritarian regimes? Still think it should be up to "the lawmakers"?
I'm an American citizen who is happy to see the law enforced. I don't live in Atherton or Palo Alto. There were two Central American gang-related shootings near my family's home last week; there were two MS-13 murders in the broader area earlier this year. I grew up here, and it didn't use to be like this. I want to see those who come here illegally, or who have been denied refugee status as the result of due process, deported.
I don't care what race, nationality, or religion they are. I also want to see families and young kids kept together, and treated humanely. But at the end of the day, we [the U.S.] are a nation of laws, or we are not. I'm proud of Palantir's efforts to help enforce the (democratically-enacted, internationally-conventional) law of the land.
I don't accept this. This is shirking responsibility for what one has brought into the world. Technology has a moral component that can't be ignored in the name of progressive idealism.
Remember that in the Nuremberg trials a prevailing opinion was that merely obeying illegal orders can itself be criminal.
Similarly, if the technology being built can reasonably be used for great harm - and the person building it is supposed to differentiate between a 'knife' and a 'Death Star' level of potential threat - then building the technology may itself become a responsibility.
Once and for all, it is the policy makers, not the tech industry, who are responsible for these operations.
Prefacing your opinion with 'once and for all' is not much better than saying 'mine is the only valid opinion' and doesn't make for a good discussion.
Policy makers certainly create the demand for such commercial services, but it is a choice to supply that demand or not, and those who find such demands morally questionable have the capability to frustrate them by refusing to participate or obstructing the supply.
Individuals can of course abdicate the moral decision-making if it makes them uncomfortable, but acting without regard to the good or ill of the outcome is an implicit endorsement of it. If people later decide that certain outcomes are a moral ill, indifferent participants may find themselves considered culpable.
I'm sorry, but your response is not based at all in historical fact. A huge part of forcing a government's hand to do the right thing is having corporations and other businesses refuse to work with it (one example is the fall of apartheid in South Africa).
Bottom-up direct action is the only real way to fight injustice; lawmakers move far too slowly to be an effective barrier to all injustice. Additionally, building a weapon is not simply building a tool (note that I'm not saying weapons are never necessary), and it's foolish to suggest otherwise.
I don't think it makes a lot of sense to say the tech industry isn't responsible. At the very least, as engineers, I think there should be some ethical considerations.
I think there is a difference between a truly neutral tool being used for harmful purposes, and a tool specifically made for harmful purposes.
Someone in the tech industry decides to build such a product for the lawmakers. These people in the tech industry know how this software is being used, and they continue supporting it.
Yes, lawmakers are responsible for it, too. But let's not justify the "we just built and sold a weapon, we didn't fire it" mentality.
While yes the final decision lies with someone else, I am very uncomfortable with this idea that the creators of these tools are completely blameless.
Tools are abused all the time, but not every tool is as destructive and likely to be abused. The use and abuse should be considered, and mitigation undertaken as part of any basic engineering ethics standard.
Tools aren’t natural, and certainly not these tools. They were made very intentionally. To use an extreme example, you don’t get to make the proverbial genetically targeted bioweapon and then throw up your hands when it’s used and say, “Hey, I’m blameless. I just made the thing.” The outcome of its use and its appeal to bad actors are obvious, and it wouldn’t have existed if you hadn’t built it.
But let’s look at Palantir in particular. It’s not just software; it’s also a consultancy via its “forward deployed engineers”. This means it’s not just a value-neutral tool, but one actively aiding customers in whatever they do.
>Once and for all, it is the policy makers, not the tech industry
This is what people working on horrible things say to themselves to justify collecting a (giant) paycheck. Many parties are at fault, whether they write the laws or the software.
I seriously want to understand: what exactly is wrong with equipping cops with better tech to catch people who blatantly violate laws? I am not talking about tech that itself violates the constitution or laws, but rather tech that works within those boundaries while helping cops do their job better.
> it is the policy makers, not the tech industry, who are responsible for these operations.
Organizationally responsible. Ethically, if you provide means to do X and can reasonably predict that someone (e.g. government) will apply X to do something unethical, then you are at least partially responsible for the final unethical result.
I hired a former CIA analyst a few years ago. Part of his job at the CIA was to contribute to the daily briefings provided to President Obama. I asked him how he felt about the ethics of spying on American citizens. He said I wouldn't believe the number of threats that were prevented on a daily basis because of the tools and data they have access to. He said they don't share the information about what or how many crimes they prevent because it would cause panic. They also don't share that information because it would give away how they were uncovering the plans.
It seemed like a viable explanation and I don't think he had any incentive to embellish...but it also seemed like something big brother would say to get me to give up freedoms.
Keep in mind that internally lots of stats etc. are posted around things like cyber attacks, where a single 'port scan' is counted as 65k different attacks (each of the TCP ports receiving a 'malicious' SYN counts as a separate attack).
It's the functional equivalent of calling satellite overflights attacks, every time they fly by.
'We are under constant attack!' also helps the internal justification of working for/with agencies like that.
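The inflation described above is easy to make concrete. A minimal sketch (invented data; the two counting methodologies are assumptions for illustration) showing how one port scan becomes tens of thousands of "attacks" depending on how you count:

```python
# One attacker sends a SYN to every TCP port: a single port scan.
# Each event is a (source_ip, dest_port) pair.
scan_events = [("203.0.113.7", port) for port in range(1, 65536)]

# Methodology A: count every probed port as its own "attack".
attacks_per_port = len(scan_events)

# Methodology B: count unique attacking sources.
attacks_per_source = len({src for src, _ in scan_events})

print(attacks_per_port)    # 65535
print(attacks_per_source)  # 1
```

The same packet capture supports either headline; the stat that gets posted internally is a choice.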
That does make sense, but the problem is it's not provable. Citizens just have to take the government's word that all of this spying is keeping us safe. That's just ripe for abuse.
And also keep in mind that the people "preventing" these threats have a vested interest in making these "threats" sound as sinister as possible. It's similar to drug busts by cops: if they seize a kilo of cocaine, they don't say they took "$10,000 worth of drugs off the street"; they say they took "$200,000 worth of drugs off the street", by making the most favorable calculation (selling individual grams at an inflated price).
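The arithmetic behind that valuation trick is simple. A sketch using the figures from the comment above (the per-gram prices are illustrative assumptions, not sourced market data):

```python
GRAMS_PER_KILO = 1000

wholesale_per_gram = 10  # assumed bulk price for a kilo seizure, $/g
street_per_gram = 200    # assumed inflated per-gram retail price, $/g

seized_kilos = 1
wholesale_value = seized_kilos * GRAMS_PER_KILO * wholesale_per_gram
headline_value = seized_kilos * GRAMS_PER_KILO * street_per_gram

print(wholesale_value)  # 10000  -> "$10,000 worth of drugs"
print(headline_value)   # 200000 -> "$200,000 worth of drugs"
```

Same kilo, 20x difference in the press release, purely from the choice of unit price.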
In communities like HN there has always been a belief that compromising privacy is not actually required to keep people safe. While I generally agree with this, I don't think it is set in stone. The threat could evolve to the point where this kind of tech is the only realistic method of detection at scale. That would be a scary world to live in. We would lose our privacy and eventually our security as attackers evolve. This could happen, but probably won't, because terrorists are still very limited in numbers and talent.
Would this be all that shocking to the general public? Watch any detective show and the fictional tools there are at least as capable as this real life tool.
Not any more than it should be shocking to a forum of computer scientists that police agencies have tools to query a database and produce visualizations.
One way to explore that question would be to make a game to expose people to these ethical questions. I am thinking of something like "Papers, Please" but recast as ICE gathering up illegals and their families.
It blows my mind that Google and Facebook have been demonized by the wider media as "surveillance capitalism" while Palantir, who for all intents and purposes is literally a surveillance tech giant, has largely escaped public consciousness and criticism.
The tool tracks where you have been, your bank account, etc. So while it may not tell what guns were purchased, it can tell if someone has visited a shooting range.
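The kind of cross-referencing described above is straightforward to sketch: join a subject's location history against known points of interest. A hypothetical example (all coordinates, names, and the 500 m radius are invented for illustration, and bear no relation to any real product's internals):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))  # mean Earth radius ~6371 km

# Hypothetical point-of-interest database and subject location pings.
points_of_interest = [("shooting range", 37.40, -122.10)]
location_history = [(37.401, -122.101), (37.80, -122.27)]

# Flag any POI that a ping comes within 500 m of.
visits = [
    name
    for name, plat, plon in points_of_interest
    for lat, lon in location_history
    if haversine_km(lat, lon, plat, plon) < 0.5
]
print(visits)  # ['shooting range']
```

The point is that no single record says "went to a shooting range"; the inference falls out of joining two ordinary datasets.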
I almost guarantee that these engineers have a very different value set than you. They could well see this as granting law enforcement legitimate and authorized access to collect evidence needed to prosecute real criminals. In their minds, abuses of the technology they make are likely the illegal part, not the existence of the tool itself.
Most of the responses I'm seeing here are about money, and while that is probably a factor at some level, I like to believe that people need a lot of additional pressure put on them by society before they will give up their personal morals in exchange for money.
What leads well-off Americans, whose safety and comfort is guaranteed by the constant efforts of law enforcement, to conclude that legal efforts to make law enforcement more efficient would make the world a worse place?
$, like someone else has said, but I also would guess that many of the engineers do not have a clear idea of the overarching functionality of the application.
I'm sure internal communication about the product is extremely positive, with phrases like "improved law enforcement accuracy by XX%, decreased customer costs and time by YY," so some may truly believe they are doing something that everyone would agree is good.
Would you rather tech companies not be trying to help law enforcement or our military, and just leave them to their own devices?
I have my answer, but I don't think it's nearly this black and white. Palantir probably does a lot of good, just like all the big tech companies, even though it's the bad stuff that mostly makes the news.
Why a worse place? In this particular example, Palantir is used by the cops to run investigations; it is just a more convenient user interface and analytical engine on top of already-existing data sources. I don’t see anything particularly unethical here.
I'm sure if you had a shred of empathy in your whole body, you could think for about five seconds and realize that maybe not everyone else had the same system of values as you.
It's both/and: the policy makers and the tech industry are both responsible.
Or the analyst's explanation could just be a way to keep up morale at the agency.
The one indisputable fact is that actual attacks are very rare. So there are only two possibilities:
1. Law enforcement is exceptionally effective at stopping them, or
2. There are very few attempted attacks to begin with.
Here's a data point to help you decide which of these is more likely:
https://abcnews.go.com/US/tsa-fails-tests-latest-undercover-...
You always hear people say that such information is withheld because it would cause panic, but they don't seem to worry about the panic that line itself might cause.
"No one is the villain in their own story".