item 42614992

jochem9 | 1 year ago

15 years ago we didn't have so many algorithms. I think Facebook still sorted by recent back then.

The pervasiveness of algorithms is a symptom of the change that has happened. It shows how companies started to compete more and more for our attention and engagement.

the_snooze | 1 year ago

I think "algorithms" disguises something bigger: a broad ethos to extract value from users, instead of delivering value to them. Modern consumer tech sees end-users as resources to exploit. Between privacy violations, vendor lock-in, worsening functionality and interoperability, and unilaterally changing terms and designs, these services demonstrate so much contempt for the people who've come to rely on them.

Instead of making things better, tech companies use their engineering prowess to impose control over people and reduce end-users' agency.

BlueTemplar | 1 year ago

And people were already warning about these issues back then: this is what "protocols, not platforms" was about.

(Corporations seeking monopoly is not a new story.)

hibikir | 1 year ago

Early on, a company makes changes just to get users, which forces it to do things people want. Eventually the growth opportunity from utility gets very small, so all the effort goes into increasing profitability per user. That rarely makes the company's products better for its users. Every top company moved to increasing margins and minimal growth years ago.

sydbarrett74 | 1 year ago

And this is precisely Doctorow's definition of 'enshittification'.

Zak | 1 year ago

Algorithms aren't inherently bad. An algorithm that keeps spam out of my email is pretty great if it works. An algorithm that presents a feed of things I actually want to see is also pretty great. I want to see life updates from my friends and family, especially media of their pets, but not sports, babies, or politics except from a couple people who write interesting things.

That's not a tall order for a recommendation algorithm with a huge supply of training data. It wouldn't even need a 50% hit rate to be really useful; I would check it once or twice a day, and post or comment several times a week, if even a third of my real-life social circle used it.
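The preference described here is, at its core, just a per-topic filter with per-author overrides, which is far simpler than an engagement-optimized ranker. A minimal sketch of that idea (all names, topics, and the `Post` shape are hypothetical, not any real platform's API):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str

# Hypothetical preferences mirroring the comment's wish list:
# show pets and life updates, hide sports/babies/politics --
# unless the author is on a per-person allow list.
LIKED_TOPICS = {"pets", "life-update"}
BLOCKED_TOPICS = {"sports", "babies", "politics"}
ALWAYS_SHOW = {"alice"}  # "a couple people who write interesting things"

def wants(post: Post) -> bool:
    """Return True if the user would want this post in their feed."""
    if post.author in ALWAYS_SHOW:
        return True  # per-author override beats topic rules
    if post.topic in BLOCKED_TOPICS:
        return False
    return post.topic in LIKED_TOPICS

feed = [Post("alice", "politics"), Post("bob", "pets"), Post("carol", "sports")]
print([p.author for p in feed if wants(p)])  # ['alice', 'bob']
```

A real system would replace the hand-written topic sets with learned per-user scores, but the point stands: optimizing for stated preferences is a much easier target than optimizing for engagement.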

An algorithm that presents me a feed of things I sometimes can't look away from even though I don't really want to see them is harmful, like an addictive drug.