StackRanker3000 | 3 months ago

Not saying you’re wrong in this particular instance, but there are all sorts of areas where we accept that harm will occur at scale (e.g. that 40,000 people per year die in motor-vehicle incidents just in the US). How do we determine what is reasonable to expect?

ethbr1 | 3 months ago

We require auto manufacturers to include certain safety features in their vehicles, to decrease deaths to a socially acceptable level.

The central ill of centralized web platforms is that the US never mandated customer/content SLAs in regulation, even as their size made that a social necessity (i.e. once they became 'too big for alternatives to be alternatives').

It wouldn't be complicated:

   - If you're a platform (hosting user content) with over X revenue...
   - You are required to meet a minimum SLA for responsiveness
   - You are also required to meet minimum correctness / false-positive targets
   - You are also required to implement and facilitate a third-party arbitration mechanism, by which a certified arbitrator (of the customer's choice) can process a dispute (also with SLAs for responsiveness); rough sketch below
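To be concrete, here is a rough sketch of how such requirements could be written down as a machine-readable policy spec. Every field name and number below is made up for illustration; none of it comes from any actual or proposed regulation.

    from dataclasses import dataclass

    # Hypothetical sketch only: thresholds and names are illustrative.
    @dataclass
    class PlatformSlaPolicy:
        revenue_threshold_usd: int        # platforms above this annual revenue are covered
        first_response_hours: int         # max time to acknowledge a customer dispute
        resolution_days: int              # max time to reach a decision on that dispute
        max_false_positive_rate: float    # cap on wrongful takedowns / suspensions
        arbitration_decision_days: int    # max time for a certified third-party arbitrator

    def is_covered(annual_revenue_usd: int, policy: PlatformSlaPolicy) -> bool:
        """A platform hosting user content is covered once it crosses the revenue threshold."""
        return annual_revenue_usd > policy.revenue_threshold_usd

    # Example with made-up numbers
    policy = PlatformSlaPolicy(
        revenue_threshold_usd=100_000_000,
        first_response_hours=72,
        resolution_days=30,
        max_false_positive_rate=0.02,
        arbitration_decision_days=45,
    )
    print(is_covered(250_000_000, policy))  # True

The specific numbers don't matter; the point is that the obligations are simple enough to write down and audit.
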
Google, Meta, Apple, Steam, Amazon, etc. could all be better, more effective platforms if they spent more time and money on dispute resolution.

As it is, they invest only what current law requires, and we get the situation we have.