gwittel | 3 years ago

Having worked in anti-abuse for nearly 20 years, I can say this is spot on. Even if it were possible, publishing “the algorithm” isn’t going to solve anything. It can’t exactly be published in secret, and once public it would be instantly obsolete.

All of this is an exercise in balancing information asymmetry and cost asymmetry. We don’t want to add more friction than necessary for end users, but we must somehow impose enough cost on abusers to keep abuse levels low.

Unfortunately for us, it generally costs far less for attackers to bypass our systems than it costs defenders to sustain a block.

As defenders, we work to exploit the things in our favor: signals and scale. Signals drive our systems, whether they’re ML, heuristics, signatures, or (more likely) a combination. Scale lets us spot larger patterns across space or time. But this comes at a cost: systems that are 99%+ effective are great, yet at scale 99% is still not good enough. Errors in either direction will slip by in the noise, especially targeted attacks.
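As a back-of-the-envelope illustration (the volumes and rates below are made-up assumptions, not real platform numbers), here is what “99% effective” means at scale:

  # Hypothetical figures: 1B events/day, 0.1% truly abusive,
  # and a classifier that is "99% effective" in both directions.
  daily_events = 1_000_000_000
  abuse_rate = 0.001
  accuracy = 0.99

  abusive = daily_events * abuse_rate          # 1,000,000 bad events
  benign = daily_events - abusive              # 999,000,000 good events

  false_negatives = abusive * (1 - accuracy)   # abuse that slips through
  false_positives = benign * (1 - accuracy)    # good actions wrongly flagged

  print(f"abuse missed per day:     {false_negatives:,.0f}")  # ~10,000
  print(f"good actions flagged/day: {false_positives:,.0f}")  # ~9,990,000

Under these assumptions you miss ten thousand attacks a day and wrongly flag roughly ten million legitimate actions, each of which is a candidate for the recourse mechanisms below.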

As a secondary step, some systems can provide recourse for errors: temporary or shadow bans, rate limiting, error reporting, and so on. Unfortunately, cost asymmetry comes into play yet again: it is far more costly to effectively remediate a mistake than it is to report one.
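Rate limiting, for example, is a classic soft measure: throttle a suspicious client rather than hard-ban it, so a false positive stays recoverable. A minimal sketch of a token-bucket limiter (illustrative only; the names and parameters are mine, not any platform’s actual implementation):

  import time

  class TokenBucket:
      # Soft enforcement: throttle instead of banning outright,
      # which keeps the cost of a false positive recoverable.
      def __init__(self, rate_per_sec: float, burst: int):
          self.rate = rate_per_sec      # tokens refilled per second
          self.capacity = burst         # maximum burst size
          self.tokens = float(burst)
          self.last = time.monotonic()

      def allow(self) -> bool:
          now = time.monotonic()
          # Refill in proportion to elapsed time, capped at capacity.
          self.tokens = min(self.capacity,
                            self.tokens + (now - self.last) * self.rate)
          self.last = now
          if self.tokens >= 1.0:
              self.tokens -= 1.0
              return True
          return False                  # throttled, not banned

  # Usage: ~5 requests/sec sustained, bursts of up to 10.
  limiter = TokenBucket(rate_per_sec=5, burst=10)
  if not limiter.allow():
      pass  # e.g. return HTTP 429 instead of suspending the account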

All of this is suboptimal. If we had a better solution, it would already be in place. Building and maintaining these systems is expensive, and that expense won’t go away unless something better comes along.

tl;dr version: assholes ruin it for everyone.

novok | 3 years ago

I think a big part of why this is a focus nowadays is that some "community standards" started crossing into political canards as abuse types. Normies who are not spammers are starting to bump into anti-abuse walls that lack real appeal processes, because those are too expensive. Now the political class is starting to demand expensive things as a result, and they have the guns.

In the past, the rules were obvious easy wins like "no child porn" and "no spam" that nobody really gave a shit about fighting; people welcomed that kind of anti-abuse because they never encountered it in their normie behavior.

To reduce the 'political' costs of their anti-abuse systems, these platforms need to drop community standards that are turning into political canards, and say that if political canards are to be enforced one way or another, it has to be done by law. That creates a much higher barrier for the political class to enact, because another political camp on the other side of the aisle will fight them tooth and nail; all political canards have multiple sides.

That might mean dropping painful things like enforcement against coronavirus misinformation, violent hate speech toward LGBT groups in certain countries, and even voting manipulation, because you have to let the political class determine the rule set there, not the company itself. Otherwise it will be determined for you, in a really bad way, even in the USA.

tptacek | 3 years ago

I mean, all of this might be true or it might not be true, but either way: if Cory Doctorow is appealing to "security through obscurity" to make his argument, he's making a clownish argument.