robot_no_421 | 1 year ago

“There has long been a concern [facial recognition] could invade upon people's lives through expanded surveillance and through the criminalization of just existing within the public sphere,” Mamdani said.

Except nobody is calling the regular fare-paying people criminals. Just the people who aren't paying the fare and are breaking the rules.


LightHugger|1 year ago

I feel like dismissals like this always lack imagination. You can't think of any uses for facial recognition surveillance that pegs you at specific locations at specific times? You can't imagine a single way to monetize that data and use it against you or other law-abiding citizens?

People said the same thing about social media data collection, and now first-degree price discrimination is getting more common. It's a total lie that only "people who are breaking the rules" suffer consequences from tracking and surveillance.

Justsignedup|1 year ago

Just to note, everything is rosy right up until the next MTA profit conversation, and suddenly it's "how can we monetize literally everything?"

Also, facial recognition is error-prone enough that I'm sure a few people will get harassed by the system constantly flagging them when they're not evading.

And evaders might wear a reflective hat or something, making them invisible to the cameras.

robot_no_421|1 year ago

> "you can't think of any uses for facial recognition surveillance that pegs you at specific locations at specific times? You can't imagine a single way how to monetize that data and use it against you or other law abiding citizens?"

Having imagination has nothing to do with staying on track and sticking to the conversation topic. Nothing you said has anything to do with criminalizing law-abiding citizens for just existing. Privacy rights are a related but separate topic; we could talk about that separately if you want to. But I don't see how AI knowing where we are, or AI enabling the monetization of our data, has anything to do with criminalizing anything I do on a daily basis.

add-sub-mul-div|1 year ago

And what about the people falsely identified as criminals by inevitably faulty AI whose bugs no one can directly fix?

robot_no_421|1 year ago

Right, but you could theoretically say this about any software or technology (and this argument has frequently been used in the past against technologies like cars and airplanes). DNA sampling has "flaws" and "bugs" too that occasionally lead to false positives. Even police officers and lawyers falsely identify people occasionally. AI would just be another tool we collect evidence with for making inferences about the world. Imperfect technology has never been a blocker for using or improving that technology until it meets its purposes.