item 40692075

bpm140 | 1 year ago

With all the ad blockers out there, which functionally demonetize content sites, why isn’t there an ad equivalent to robots.txt that says “don’t display this site if ads are blocked”?

There are so many good comments from several points of view in this thread, and the thing I can't square is the same person championing ad blockers while condemning agents like Perplexity.


qeternity | 1 year ago

Because these are all voluntary standards. If you want your content to be discoverable and accessible, you don’t get to dictate how someone renders it. If you want to force monetization, adopt a different business model.

bpm140 | 1 year ago

I don’t think you’re following my point (I probably explained it poorly).

People voluntarily agreed to follow the robots.txt model when they could have ignored it. To this day, a plurality of people seem to support that standard.

That doesn't keep content from being discoverable or accessible. There are all sorts of ways to find web sites that don't rely on crawlers: directories, web rings, social media, etc.
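For context on just how voluntary the robots.txt model is: the file is nothing but a plain-text request, and a crawler honors it only if it chooses to consult it. A minimal sketch using Python's standard-library parser (the hostname, paths, and bot name here are made up for illustration):

```python
from urllib.robotparser import RobotFileParser

# A minimal robots.txt: the site *asks* all crawlers to stay out of /private/.
# Nothing enforces this -- a crawler that never reads the file sees no barrier.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rfp = RobotFileParser()
rfp.parse(robots_txt.splitlines())

# A well-behaved crawler checks before fetching each URL.
print(rfp.can_fetch("SomeBot", "https://example.com/public/page"))   # True
print(rfp.can_fetch("SomeBot", "https://example.com/private/page"))  # False
```

The entire "standard" is that last check: cooperative crawlers call `can_fetch` and skip disallowed URLs, while anyone else can simply not bother, which is the crux of the debate above.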

There could have been an ads.txt model, but people would likely have ignored it. Your response seems to be the norm for defending ad blockers: you somehow have a right to the content, and if the site can't force you to view its ads, that's on them.

Why do people get to dictate who accesses a page but not how it’s accessed? That binary seems completely arbitrary.