adamweld | 2 years ago

It's not that Google search has necessarily gotten worse, but it's lagging behind the enshittification of the web caused by SEO, and the new features (the "People Also Ask" QA synopses) have absolutely horrible accuracy.

ChatGPT and other LLMs are also absolutely horrible on accuracy, don't get me wrong.

But, here's an example of what's wrong with Google.

I search something like "Can mangoes grow in Washington state?" and at the top of my results is the condensed "People Also Ask" question-and-answer box. These attempt to read and condense a webpage (itself of questionable accuracy) into an answer for my query, but they are often full of shit.

For example, expanding "Where are mangoes grown in WA" shows an answer about Western Australia rather than Washington. Another answer tells me "yes," but when I read the actual article it clearly says "no, they won't survive."

hibikir|2 years ago

Google has been fighting against SEO basically from the beginning. For many years, you could see the difference from other search engines that had worse tech. In general, Google did very well against SEO bots for well over a decade.

Today, I think they are losing. Quality primary sources are often crushed by unusable websites that understand Google's metrics very well. If I make the information hard to find but make it seem like it's just the next paragraph down, my search ranking will improve!

Google itself is causing the enshittification of third-party websites, many of which have paragraphs upon paragraphs of obvious spam. I'd take any video game guide website from 2005 over the first page of Google today.

amoss|2 years ago

Google understood a long time ago that they could not beat SEO, and they have been fighting a losing battle ever since. I remember a research presentation from them (might have been the late '00s or early '10s) in which they wanted to know: given an adversary with unlimited resources who can create as many webpages and servers as they want, how do we distinguish pages in "their" internet from pages in the "real" internet? The basic answer at the end of the seminar was: you can't. There is no information-theoretic way to do it from the structure of the graph alone. Instead you need to follow chains of trust, which means you need roots of trust, which means... look at the web today, dominated by a handful of known platforms.
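
That "chains of trust" answer is basically the idea the TrustRank line of work formalized. Here's a rough toy sketch of it, purely my own illustration and not anything from that seminar: trust starts at a small hand-picked seed set and decays as it flows along outlinks, so a link farm that only links to itself never accumulates much trust, no matter how many pages the adversary spins up. The graph, seed list, and decay value below are all made up.

    # Toy sketch of seed-based trust propagation (illustrative only).
    # graph: page -> list of pages it links to.
    def propagate_trust(graph, seeds, decay=0.85, rounds=20):
        trust = {p: 0.0 for p in graph}
        for s in seeds:
            trust[s] = 1.0 / len(seeds)
        for _ in range(rounds):
            nxt = {p: 0.0 for p in graph}
            for s in seeds:                        # seeds keep a share each round
                nxt[s] += (1 - decay) / len(seeds)
            for page, links in graph.items():      # the rest flows along outlinks
                if links:
                    share = decay * trust[page] / len(links)
                    for target in links:
                        nxt[target] += share
            trust = nxt
        return trust

    graph = {
        "trusted-hub": ["legit-page"],
        "legit-page":  ["trusted-hub"],
        "spam-a":      ["spam-b"],   # spam farm links only to itself
        "spam-b":      ["spam-a"],
    }
    print(propagate_trust(graph, seeds=["trusted-hub"]))

Run it and the two spam pages end up with zero trust, because nothing from the seed set ever points into their subgraph -- which is exactly the "roots of trust" conclusion.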

dpkirchner|2 years ago

It'd be so, so easy for Google to give us the option to remove sites we don't want in search results. (And by give us I mean return to us, of course.)

It does seem like they've finally removed GitHub scrapers like gitmemory from the results -- at least I haven't seen them in a while.

felix318|2 years ago

I find it interesting that people write Google queries in correct, often polite English, as if there were something intelligent on the other side. When I try the query "where mango grow washington" I seem to get decent results, while the human-sounding query returns garbage.

Perhaps the problem with Google is that it's trying too hard to convince people that it's smarter than it really is. I treat it as a stupid, mindless computer and it works fine most of the time.

worrycue|2 years ago

Yup, keywords are still how I search most of the time, unless I'm literally searching a question (or something close to it) hoping to hit a Stack Overflow (or one of its clones) or Reddit post asking that question.

You can tell who all the old people are who have been using search engines since back in the day. LOL

All I ever expect from a search engine is that it finds pages with the words in my query (and maybe excludes certain words, as specified); anything else is just gravy.
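
To be concrete about that contract, here's a toy version (my own illustration, obviously nothing like a real engine): return every page containing all the query words, minus any page containing an excluded word.

    # Toy keyword search with exclusion (illustrative only).
    def keyword_search(docs, required, excluded=()):
        hits = []
        for title, text in docs.items():
            words = set(text.lower().split())
            if all(t in words for t in required) and not any(t in words for t in excluded):
                hits.append(title)
        return hits

    docs = {
        "wa-gardening": "mango trees will not survive a washington winter",
        "au-growers":   "mango orchards thrive across western australia",
    }
    print(keyword_search(docs, required=["mango", "washington"], excluded=["australia"]))

Everything beyond that -- stemming, synonyms, ranking -- is the gravy.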

zoomablemind|2 years ago

Ha, I clearly remember when natural-language queries were being encouraged for searching. I took that to be the path toward Assistant.

Still, I agree that keyword search remains practical.