top | item 17430719

nolemurs | 7 years ago

> I wish that if they’re gonna downrank “good” pirate content then they should do the same (or worse) for the bad/malicious ones.

I'm pretty confident they'd like to, but it's a tricky problem. Based on this article, they're downranking based on the number of legitimate DMCA requests a site receives: the good sites will generate a lot of signal on that front, but the bad/malicious sites won't.

The malicious/scam sites usually come and go quickly, so it's hard to learn which ones they are: by the time you figure out one site is a scam, a new one has popped up.

A real solution would require being able to make an automatic judgment about whether a site is bad/malicious based on the content of the site itself, but that's really hard, and false positives are really costly, so you have to be super conservative about it.
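The DMCA-count approach described above can be sketched roughly as follows. This is a hypothetical toy model, not Google's actual pipeline: the function names, the threshold, and the demotion curve are all assumptions chosen to illustrate the "conservative demotion" idea, where sites with only a handful of notices are left completely untouched.

```python
# Hypothetical sketch of DMCA-based demotion. Nothing here reflects any
# real search engine's ranking code; names and thresholds are invented.

def demotion_factor(dmca_notices: int, threshold: int = 50) -> float:
    """Return a multiplier in (0, 1] applied to a domain's base score.

    Domains at or below the threshold are untouched; staying conservative
    avoids false positives on sites with a few spurious notices.
    """
    if dmca_notices <= threshold:
        return 1.0
    # Demote smoothly as the notice count grows past the threshold.
    return threshold / dmca_notices

def ranked_score(base_score: float, dmca_notices: int) -> float:
    """Combine a domain's base relevance score with the DMCA demotion."""
    return base_score * demotion_factor(dmca_notices)
```

For example, a domain with 25 notices keeps its full score, while one with 200 notices is cut to a quarter of it. The weakness the parent comment points out is visible here: a scam site that attracts no DMCA notices sails through with `demotion_factor == 1.0`.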

GW150914 | 7 years ago

A real solution would be to employ competent humans rather than build and rely on incompetent algorithms.

saalweachter | 7 years ago

Congratulations! You just invented the original Yahoo!.

In all seriousness, curation is one of the most valuable skills a human can provide. I do sometimes wish it were easier to find curated content on the internet, but that isn't really what a search engine does. It is something that, e.g., HN and Reddit provide.

Rjevski | 7 years ago

As much as I'm against algorithms on closed platforms and walled gardens, I believe algorithms are the way to go on the open web, especially since the amount of content there is effectively infinite.

nolemurs | 7 years ago

I'm pretty sure the scale of the problem is large enough to make a manual approach completely and utterly impractical.