I don't like that this is the case, but you understand that a pretty huge fraction of the country doesn't share your set of political premises that providing data for immigration enforcement is unethical, right? (I do, but that shouldn't matter for the analysis.)
As the CEO of Flock, don't you feel you have more information to offer this community outside of the "we do not sell data" statement you've made over and over? The fact that you do not engage here in the ethical aspects of your product doesn't look good for you and only deepens suspicion that something darker is going on behind your doors.
Seems like a broad dismissal of the claim made upthread ("Flock sells its data to ICE and law enforcement"). Why do you think it is excessively specific?
CharlesW|3 months ago
https://www.aclu.org/news/privacy-technology/flock-massachus...
https://www.404media.co/ice-taps-into-nationwide-ai-enabled-...
https://www.aclu.org/news/national-security/surveillance-com...
tptacek|3 months ago
It seems weird to me to hyperfocus on Flock's role here rather than the role your own local municipalities play in deciding how to configure these things. Not sharing with ICE is apparently quite doable? At least to the point of requiring a court order to get access to the data, which is a vulnerability all online cameras share.
Later: s/company/country, thanks for the correction!
potato3732842|3 months ago
I don't think anyone with a network like that can refuse to "give" the contents to the feds for very long without drawing ire.
unknown|3 months ago
[deleted]
garrettlangley|3 months ago
[deleted]
unknown|3 months ago
[deleted]
ncr100|3 months ago
This is part of the problem with Flock, IMO. Lack of adherence to or support of norms. Psychopathy actualized as a corporation.
The societal impact of disruption of trust, of personal privacy, is under-appreciated by the corporation. It's concerned with winning profit.
(Meta) It's a nonspecific argument I'm lazily laying out, yes, but the problem is ridiculously obvious.
We should not have to ask to be respected, and here we are.
Democratic decline (in both the systems and participation in them), truth, self-respect and understanding of one's own rights ... those qualities are dying under the relentless, toxic, ethically under-explored capitalization of our laws and resources. (Especially in the USA; compare to countries with stronger corporate social responsibility norms, I suspect.)
Tech disruption is amazing to watch, and participate in, like a fire consuming the forest. "But what about the children?"
throwaway101012|3 months ago
[deleted]