top | item 28281150

ksangeelee | 4 years ago

I did something along those lines as a proof of concept, seeded with links harvested from this site.

http://kakapo.susa.net:8080/cfs/

A similar (and in my opinion more viable) approach is Marginalia Search. This down-scores pages with a large number of scripts, among other heuristics.

https://search.marginalia.nu/
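Marginalia's exact ranking is not described here, but the "down-score pages with many scripts" heuristic mentioned above can be sketched with a few lines of Python. Everything here (the `script_penalty` name, the `weight` parameter, the saturating formula) is hypothetical, purely for illustration:

```python
from html.parser import HTMLParser


class ScriptCounter(HTMLParser):
    """Counts <script> tags as a crude proxy for page 'heaviness'."""

    def __init__(self):
        super().__init__()
        self.scripts = 0

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.scripts += 1


def script_penalty(html: str, weight: float = 0.1) -> float:
    """Hypothetical down-score in [0, 1): more scripts, larger penalty."""
    counter = ScriptCounter()
    counter.feed(html)
    # Saturating penalty so a single extra script matters less on heavy pages.
    return 1.0 - 1.0 / (1.0 + weight * counter.scripts)


page = "<html><head><script></script><script></script></head><body>hi</body></html>"
print(script_penalty(page))  # two scripts -> small but nonzero penalty
```

A real ranker would combine this with other signals (page size, tracker domains, text-to-markup ratio), but the shape is the same: compute a penalty per page and subtract it from the relevance score.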

NiceWayToDoIT | 4 years ago

I like your proof of concept. I guess the next step would be to take Google results, crawl each of them, and convert the pages into scores, rejecting in the process those that fall outside defined limits. Additionally, a weekly list of the most infamous sites would be interesting to see, in the sense of what people visit the most (probably social/ad driven) but which has bad UX...
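The "score and reject" step suggested above could look something like this. The function name, the cutoff value, and the candidate list are all made up for the sketch; the penalties would come from whatever per-page heuristics the crawler computes:

```python
def filter_results(pages, cutoff=0.3):
    """pages: iterable of (url, penalty) pairs.

    Keeps only URLs whose penalty stays under the (hypothetical) cutoff;
    everything above the limit is rejected outright.
    """
    return [url for url, penalty in pages if penalty < cutoff]


# Hypothetical crawl output: each page already scored by some heuristic.
candidates = [
    ("https://example.com/light-page", 0.05),
    ("https://example.com/script-heavy-page", 0.65),
]
print(filter_results(candidates))  # only the light page survives
```

The rejected pages, sorted by penalty, are exactly the raw material for the "weekly most infamous sites" list the comment proposes.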