(no title)
Tenemo|1 month ago
I think it is true that it is a real problem (EDIT: but doesn't necessarily make "hosting untenable"), but you are correct to point out that modern pages tend to be horribly optimized (and that's the source of the problem). Even "dynamic" pages using React/Next.js etc. could be pre-rendered and/or cached and/or distributed via CDNs. A simple cache or a CDN should be enough to handle pretty much any scraping traffic unless you need to do some crazy logic on every page visit – which should almost never be the case on public-facing sites. As an example, my personal site is technically written in React, but it's fully pre-rendered and doesn't even serve JS – it can handle huge amounts of bot/scraping traffic via its CDN.
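[For context, a minimal sketch of what that pre-rendering setup might look like in Next.js. This is illustrative, not the commenter's actual configuration, and it assumes Next.js 15+, which accepts a TypeScript config file.]

```ts
// next.config.ts — hypothetical config for a fully pre-rendered site
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  // Render every page to static HTML at build time; there is no Node server
  // at runtime, and the generated `out/` directory can be pushed straight to
  // a CDN or any static host.
  output: "export",

  // Emit /about/index.html instead of /about.html so plain static hosts
  // serve clean URLs without rewrite rules.
  trailingSlash: true,
};

export default nextConfig;
```

With the exported HTML sitting behind a CDN and a long-lived Cache-Control max-age, scraper requests are answered from the edge cache and essentially never reach the origin.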
consumer451|1 month ago
Or, is that what orgs like Perplexity are doing, but with an LLM API? Meaning that they have their own indexes, but the original q= SERP API concept is a dead end in the market?
Tone: I am asking genuine questions here, not trying to be snarky.
arantius|1 month ago
Also, of course, the amount of spam-for-SEO (pre-slop slop?) as a proportion of what's out there has also grown over time.
IOW: Google has "gotten worse" because the web has gotten worse. Garbage in, garbage out.