nolemurs|7 years ago
I'm pretty confident they'd like to, but it's a tricky problem. Based on this article, they're downranking based on the number of legitimate DMCA requests a site receives. The good sites will generate a lot of signal on that front; the bad/malicious sites won't.
The malicious/scam sites usually come and go quickly, so it's hard to learn which ones they are: by the time you figure out one site is a scam, a new one has popped up.
A real solution would require making an automatic judgment about whether a site is bad/malicious based on its content, but that's really hard, and false positives are really costly, so you have to be super conservative about it.
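To make the "super conservative" point concrete, here is a minimal sketch of what a notice-based downranking rule might look like. Everything here is invented for illustration: the function name, thresholds, and scoring are assumptions, not Google's actual algorithm, which the article does not describe in detail.

```python
# Hypothetical sketch of conservative downranking from DMCA-notice volume.
# The thresholds and formula are illustrative assumptions only.

def downrank_penalty(valid_dmca_notices: int, total_indexed_pages: int,
                     min_notices: int = 50, min_rate: float = 0.01) -> float:
    """Return a rank penalty in [0, 1]; 0 means no downranking.

    Conservative by design: a site is only penalized when it has both a
    large absolute number of valid notices and a high per-page notice rate.
    This keeps false positives rare, at the cost of missing some bad sites.
    """
    if total_indexed_pages == 0:
        return 0.0
    rate = valid_dmca_notices / total_indexed_pages
    if valid_dmca_notices < min_notices or rate < min_rate:
        return 0.0  # not enough signal; err on the side of no penalty
    # Penalty grows with the notice rate, capped at 1.0.
    return min(1.0, rate * 10)
```

The two-threshold design (absolute count and rate) is one way to encode the trade-off from the comment above: a legitimate site with a handful of stray notices is never touched, while only sites with sustained, dense notice volume get penalized.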
unknown|7 years ago
[deleted]
saalweachter|7 years ago
In all seriousness, curation is one of the most valuable skills a human can provide. I do sometimes wish it were easier to find curated content on the internet, but that isn't really what a search engine does. It is something that e.g. HN and Reddit provide.