top | item 43619768

Ask HN: Do you still use search engines?

350 points| davidkuennen | 10 months ago

Today, I noticed that my behavior has shifted over the past few months. Right now, I exclusively use ChatGPT for any kind of search or question.

Using Google now feels completely lackluster in comparison.

I've noticed the same thing happening in my circle of friends as well—and they don’t even have a technical background.

How about you?

580 comments


wavemode|10 months ago

Search is primarily a portal - you know a particular resource exists, you just don't know its exact URL.

You hear about this new programming language called "Frob", and you assume it must have a website. So you google "Frob language". You hear that there was a plane crash in DC, and assume (CNN/AP/your_favorite_news_site) has almost certainly written an article about it. You google "DC plane crash."

LLMs aren't ever going to replace search for that use case, simply because they're never going to be as convenient.

Where LLMs will take over from search is when it comes to open-ended research - where you don't know in advance where you're going or what you're going to find. I don't really have frequent use cases of this sort, but depending on your occupation it might revolutionize your daily work.

Modified3019|10 months ago

IMO an example of a good use case for an LLM, which would be otherwise very hard to search for, is clarifying vague technical concepts.

Just yesterday I was trying to remember the name of a vague concept I’d forgotten, with my overall question being:

“Is there a technical term in biology for the equilibrium that occurs between plant species producing defensive toxins, and toxin resistance in the insect species that feed on those plants, whereby the plant species never has enough evolutionary pressure to increase its toxin load enough to kill off the insect that is adapting to it”

After fruitless searching around because I didn’t have the right things to look for, putting the above in ChatGPT gave an instant reply of exactly what I was looking for:

“Yes, the phenomenon you're describing is often referred to as evolutionary arms race or coevolutionary arms race.”

rstuart4133|10 months ago

I'd go further, and say I use search when I'm pretty confident I know the right search terms. If I don't, I'll type some wordy long explanation of what I want into an LLM and hope for the best.

The reason is pretty simple. If the result you want is in the first few search hits, search is always better: your query is shorter so there's less typing, the search engine is always faster, and the results are far better because you sidestep the LLM hallucinating as it regurgitates what it remembers from the page you would have read if you had searched.

If you aren't confident of the search terms, it can take half an hour of dicking around with different terms, clicking through a couple of pages of search results for each set of terms, until you finally figure out the lingo to use. Figuring out what you're really after from that wordy description is the inner magic of LLMs.

deadbabe|10 months ago

We’re currently in the golden age of LLMs as search engines. Eventually they’ll subtly push products and recommendations in their output to steer you toward specific things.

keithnz|10 months ago

Have you tried ChatGPT search? You can do "DC plane crash" or "Frob" and it will come up with links to the story, but it will also quickly give you a summary with links to its sources. Best of all, you can follow up with questions.

npilk|10 months ago

Agreed. I think of these as two different types of searches - “page searches” where you know a page exists and want to get to it, and “content searches” where you have a more open-ended question.

Really, for many “page searches”, a good search engine should just be able to take you immediately to the page. When I search “Tom Hanks IMDB”, there’s no need to see a list of links - there’s obviously one specific page I want to visit.

https://notes.npilk.com/custom-search

generalizations|10 months ago

It's been pretty cool to realize that Grok 3 actually prioritizes up-to-date information: I have used it for both kinds of examples you gave, and it worked.

desipenguin|10 months ago

Agree 100%. I tried Perplexity to "search". My use case was similar to the one described above.

I know what I'm looking for. I just need exact URL.

Perplexity miserably fails at this.

dumbfounder|10 months ago

Yes they will. Why do you think they won't? They certainly can. You just use RAG to look up the latest news based on the keywords you are using. You can use search on the back end and never surface a list of results unless the LLM decides that is a good idea. It curates the results for you, or gives you the singular site you need with context. That is better for most searches.
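The flow being described can be sketched in a few lines. This is a minimal, hypothetical illustration: `web_search` and `llm_complete` are stand-ins for a real search backend and model API, not actual libraries.

```python
# Sketch of LLM-fronted search (RAG): retrieve fresh results by keyword,
# then have the model answer from them instead of from stale training data.
# `web_search` and `llm_complete` are hypothetical stand-ins.

def web_search(query, k=3):
    # Placeholder: a real implementation would call a search API.
    results = [
        {"title": "DC plane crash: what we know",
         "url": "https://example.com/dc-crash",
         "snippet": "A regional jet went down near the airport..."},
    ]
    return results[:k]

def llm_complete(prompt):
    # Placeholder: a real implementation would call a model endpoint.
    return "Summary of the retrieved sources, with citations."

def answer_with_rag(query):
    hits = web_search(query)
    # Number the sources so the model can cite them.
    context = "\n".join(
        f"[{i + 1}] {h['title']} ({h['url']}): {h['snippet']}"
        for i, h in enumerate(hits)
    )
    prompt = (
        "Answer using only the sources below, citing them by number.\n"
        f"{context}\n\nQuestion: {query}"
    )
    return llm_complete(prompt), [h["url"] for h in hits]

answer, sources = answer_with_rag("DC plane crash")
print(answer)
print(sources)
```

The key design point is the last line of `answer_with_rag`: the URLs come back alongside the answer, so the interface can show either a summary, a list of links, or both.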

bmcahren|10 months ago

I am going to cite you in a decade. Already today ChatGPT is _far_ better than Google. Instead of finding a keyword optimized page for "frob language", I can get the objectively best sources for frob language and even find the best communities related to it. Zero frob ads, zero frob-optimized pages that are designed to trick google, etc.

Traditional search is dead, semantic search through AI is alive and well.

I can't count a single time AI has misunderstood the meaning of my search, while Google loves to make assumptions, rewrite my search query, and deliver the results that pay it best and have the best ads (in my opinion as a lifetime user).

Let's not even mention how they willingly accept misleading ads atop the results, which trick the majority of common users into downloading malware and adware on the regular.

crowcroft|10 months ago

Yea 'needle in a haystack' style search is something that LLM based search is simply not as good at.

The reason Google is still seeing growth (in revenue etc.) is that a lot of 'commercial' search still ends with this kind of action.

Take purchasing a power drill for example, you might use an LLM for some research on what drills are best, but when you're actually looking to purchase you probably just want to find the product on Home Depot/Lowe's etc.

kiney|10 months ago

LLMs have already replaced that news example for me. Grok especially is really good at summarizing the state of reporting on current events like plane crashes.

FloorEgg|10 months ago

Except when search engines bury the thing you're obviously looking for under an entire page of sponsored ads, then that convenience argument starts to not hold up as well...

moralestapia|10 months ago

>LLMs aren't ever going to replace search for that use case, simply because they're never going to be as convenient.

What? On Planet Earth, this is already a thing.

mr_toad|10 months ago

> Search is primarily a portal - you know a particular resource exists, you just don't know its exact URL.

Kind of like a manual, with an index.

RTFM people.

coldtea|10 months ago

>LLMs aren't ever going to replace search for that use case, simply because they're never going to be as convenient.

Sounds trivial to integrate an LLM front end with a search engine backend (probably already done), and be able to type "frob language" and it gives you a curated clickable list of the top resources (language website, official tutorial, reference guide, etc) discarding spam and irrelevant search engine results in the process.

tremarley|10 months ago

If you wanted to know more about a new programming language named “Frob” or a plane crash that happened today, couldn’t you use an LLM like grok?

Or any other LLM that’s continuously trained on trending news?

Okawari|10 months ago

I still prefer traditional search engines over LLMs, but I admit their results feel worse than they have traditionally.

I don't like LLMs for two reasons:

* I can't really get a feel for the veracity of the information without double checking it. A lot of context I get from just reading results from a traditional search engine is lost when I get an answer from a LLM. I find it somewhat uncomfortable to just accept the answer, and if I have to double check it anyways, the LLM's answer is kind of meaningless and I might as well use a traditional search engine.

* I'm missing out on learning opportunities that I would usually get by reading or skimming through a larger document while trying to find the answer. I appreciate that I skim through a lot of documentation on a regular basis and can recall things I just happened to read when looking for a solution to another problem. I would hate it if an LLM dropped random tidbits of information when I was looking for concrete answers, but since it's a side effect of my information-gathering process, I like it.

I would, however, use an AI assistant that could help me search and curate the results instead of trying to answer my question directly, hopefully in a sleeker way than Perplexity does with its sources feature.

SoftTalker|10 months ago

One thing I don't like about LLMs is that they vomit out a page of prose as filler around the key point which could just be a short sentence.

At least that has been my experience. I admit I don't use LLMs very much.

graemep|10 months ago

> I can't really get a feel for the veracity of the information without double checking it.

This is my main reason for not using LLMs as a replacement for search. I want an accurate answer. I quite often search for legal or regulatory issues, health, scientific issues, and specific facts about lots of things. I want authoritative sources.

supportengineer|10 months ago

Am I the only one who double checks all of the information presented to me, from any source?

leptons|10 months ago

>A lot of context I get from just reading results from a traditional search engine is lost when I get an answer from a LLM. I find it somewhat uncomfortable to just accept the answer, and if I have to double check it anyways, the LLM's answer is kind of meaningless and I might as well use a traditional search engine.

And this is how LLMs perform when LLM-rot hasn't even become widely pervasive yet. As time goes on and LLMs regurgitate into themselves, they will become even less trustworthy. I really can't trust what an LLM says, especially when it matters, and the more it lies, the more I can't trust them.

bluGill|10 months ago

I find LLMs useful for the case where I'm not sure what the right terms are. I can describe something and the LLM gives me a term which I then type into a search engine to get more information. I'm only starting to use LLMs though, so maybe I'll use them more in the future? - only time will tell.

miloignis|10 months ago

Yes, I use search engine(s) constantly - namely Kagi, which really does feel like Google used to. I tried using LLMs for a recent project of mine when I was trying to figure out if something was possible, and they were actively misleading, every time. The issue was that what I was asking for turned out not to be currently possible, but the LLMs wouldn't tell me that and would make up incorrect ways to solve my problem, since they didn't want to tell me it couldn't be done.

Really, these days, either I know some resource exists and I want to find it, in which case a search engine makes much more sense than an LLM which might hallucinate, or I want to know if something is possible / how to do it, and the LLM will again hallucinate an incorrect way to do it.

I've only found LLMs useful for translation, transcription, natural language interface, etc.

NelsonMinar|10 months ago

My experience too. The problem isn't search, it's Google. Kagi really is very useful. I use LLMs for some things but still lots of Kagi search.

marvinblum|10 months ago

It's the same for me. I switched to DuckDuckGo about 2 or 3 years ago and it feels like Google used to. I'm always shocked to see how bad the results are and how cluttered the top section is on Google if I happen to search there on someone else's computer.

LLMs have mostly been useful for three things: single line code completion (in GoLand), quickly translating JSON, and generating/optimizing marketing texts.

averageRoyalty|10 months ago

Agreed, for resource location, Kagi feels like Google did 20 years ago.

I use LLMs as a sounding board. Often if I'm trying to tease out the shape of a concept in my head, it's best to write it out. I now do this in the form of a question or request for information and dump it into the LLM.

jlbang|10 months ago

Fully agreed. I pay for Kagi monthly now, and it's totally worth it. I really hope they grow and become better known, because they're doing what one of the biggest companies in the world is doing, doing it better, and it seems so few people even know about them.

star-glider|10 months ago

This is my favorite thing about Kagi: you can do both. If you just append a question mark, it'll run the search through a simple LLM and give you those results (with citations) right before the standard search. From there, you can proceed to a more sophisticated frontier model if that's more effective.

"Search" can mean a lot of things. Sometimes I just want a website but can't remember the URL (traditional); other times I want an answer (LLMs); and other times, I want a bunch of resources to learn more (search+LLMs).

sshine|10 months ago

And sometimes you have all the data, but it’s too much, so you ask for a summary and ask elaborating questions.

bayindirh|10 months ago

I use Kagi exclusively and refuse to offload my brain to a thing that has no accuracy guarantee whatsoever. The answers it emits can be completely bogus, and the developers of these things lowkey expect me to believe what their black box says? Nah, never.

Instead, I use a search engine and do my own reading and filtering. This way I learn about what I'm researching, too, so I don't fall into the vicious cycle of drug abu ^H^H^H^H^H laziness. Otherwise I'll inevitably rely more and more on that thing, and be a prisoner of my own making by increasingly offloading my tasks to a black box and becoming dependent on it.

drpixie|10 months ago

100% agree.

Google recently (unrequested) provided me with very detailed AI generated instructions for server config - instructions that would have completely blown away the server. There will be someone out there who just follows the bouncing ball, I hope they've got good friends, understanding colleagues, and good backups!

tasuki|10 months ago

> I use Kagi exclusively and refuse to offload my brain to a thing which has no accuracy guarantee ever.

What a weird sentence. What accuracy guarantees does Kagi have? Or, if you're not "offloading your brain to it", can't you do the same with an LLM?

EliasWatson|10 months ago

Google results have gotten so terrible over the years. I switched to Kagi long ago and haven't looked back. Whenever I use Google on another computer, I'm shocked by how awful the results are compared to Kagi.

As for AI search, I do find it extremely useful when I don't know the right words to search for. The LLM will instantly figure out what I'm trying to say.

sshine|10 months ago

Probably 70% of my searches are FastGPT searches, meaning I end my search query with a ‘?’ and Kagi summarises the results, so I don’t need to click.

And the ratio between using search engine and Kagi’s LLM agent with search is still 70% search. Sometimes, searching is faster, sometimes asking AI is faster.

jacobmarble|10 months ago

Same. I switched to Kagi over a year ago, and now every other search engine looks like a steaming pile of ads and slop.

tiborsaas|10 months ago

I'm the inverse: I still use search engines 90% of the time, mostly Google. LLMs can't help me with researching Hungarian companies offering screws, furniture, TVs, etc. that I need for my home renovation. They can't find me the best route to a cafe, look up users, or find information on famous people. Google is also faster than me typing a good prompt.

I use LLMs for what they are good at: generative stuff. I know some tasks take me a long time, and I can shortcut them with LLMs easily.

So here's a ChatGPT example query* which is completely off:

https://chatgpt.com/share/67f5a071-53bc-8013-9c32-25cc2857e5...

* It's intentionally bad, to be able to compare with Google.

And here's the web result, which is spot on:

https://imgur.com/a/6ELOeS1

zer00eyz|10 months ago

My take is close to yours...

LLMs are great when you want AN answer, and don't want to get sidetracked.

Search is great when you want to know what answers are out there. The best example is recipes: from what spices go into chai to the spice mix in any given version of chili (let's not start on beans).

The former is filling in missing knowledge; the latter is learning.

yellowapple|10 months ago

LLMs are still notorious for hallucination; last I checked ChatGPT in particular still hallucinates about 1/3rd of the time.

So yeah, I do still use search engines, specifically Kagi and (as a fallback) DuckDuckGo. From either of them I might tack on a !g if I'm dissatisfied with the results, but it's pretty rare for Google's results to be any better.

When I do use an LLM, it's specifically for churning through some unstructured text for specific answers about it, with the understanding that I'll want to verify those answers myself. An LLM's great for taking queries like "What parts of this document talk about $FOO?" and spitting out a list of excerpts that discuss $FOO that I can then go back and spot-check myself for accuracy.
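That "find the parts about $FOO, then spot-check them" workflow can be sketched as chunking the document and filtering for candidate excerpts. This is a hypothetical illustration using a cheap keyword pre-filter; in practice, an LLM call per chunk would judge relevance more flexibly than string matching.

```python
# Sketch: pull candidate excerpts about a topic out of unstructured text,
# so each returned passage can be spot-checked by hand afterwards.
# A real pipeline would pass each chunk to an LLM instead of (or after)
# this keyword filter.

def find_excerpts(document, topic):
    # Split on blank lines into paragraph-sized chunks.
    chunks = [c.strip() for c in document.split("\n\n") if c.strip()]
    # Cheap pre-filter; keeps the verification burden small.
    return [c for c in chunks if topic.lower() in c.lower()]

doc = """Intro paragraph about nothing in particular.

Retries use exponential backoff with jitter.

Closing remarks and acknowledgements."""

print(find_excerpts(doc, "backoff"))
```

Because the output is a list of verbatim excerpts rather than a paraphrase, checking it for accuracy only requires confirming each passage actually appears in the source.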

mepian|10 months ago

Yes, I'm still using Google as I haven't found LLMs useful as a search engine replacement.

dowager_dan99|10 months ago

But the AI responses from Google that dominate the screen real estate are terrible. When you repeat the exact same query and get changing (but all wrong) answers, something is broken. I've resorted to including profanity in all my searches to prevent the AI responses, which is suboptimal at work...

stonemetal12|10 months ago

No, I find it unwilling to produce factual information.

For example Jeep consistently lands at the bottom of the reliability ratings. Try asking GPT if Jeeps are reliable. The response reads like Jeep advertising.

s1artibartfast|10 months ago

GPT models have a people-pleasing bias and a positivity bias. If you want factual information, you have to modify your prompt. I imagine you would get very different results if you asked "are Jeeps more reliable than Toyotas" or "how do Jeeps compare to the median car in terms of reliability".

My impression is that different LLMs are more or less people-pleasing. I found Grok is more willing to tell me something is a bad idea.

0xbadcafebee|10 months ago

Instead ask it to show you links to websites that review reliability ratings and highlight the results for Jeeps along with sources. It's annoying, but how you ask it questions is often more important than what you're asking. (This was a thing when search engines were first introduced too)

marcusverus|10 months ago

LLMs are like humans. They don't know what you mean, only what you say. They can't tell you what you want to know; they can only answer the question you actually ask! The question you asked is broad and phrased in a way that begs a simplistic answer about the entire brand. Obviously an answer to that question will do a worse job of laying out the relative reliability of current Jeep models than would a report created to address that specific question.

If you want to know how modern Jeep models stack up against their peers in terms of reliability, try asking GPT that question!

RattlesnakeJake|10 months ago

I use DuckDuckGo, with the occasional reddit !g appended if I'm looking for something experience-based.

For me, searches fall into one of three categories, none of which are a good fit for LLMs:

1. A single business, location, object, or concept (I really just want the Google Maps or Wikipedia page, and I'm too lazy to go straight to the site). For these queries, LLMs are either overkill or outdated.

2. Product reviews, setup instructions, and other real-world blog posts. LLMs want to summarize these, and I don't want that.

3. Really specific knowledge in a limited domain ("2017 Kia Sedona automatic sliding door motor replacement steps," "Can I exit a Queue-Triggered Azure Function without removing it from the queue?"). In these cases, the LLMs are so prone to hallucination that I can't trust them.

georgemcbay|10 months ago

I still use google but I pretty much always append site:reddit.com to the query.

The answer I'm seeking is not always on reddit itself, but google limited to reddit is far more likely to give me quality starting links than google unbound is.

supportengineer|10 months ago

In response to the final sentence: you can work around this by breaking your problem or question down into smaller pieces, essentially forcing it to reason manually.

hbn|10 months ago

I've tried DDG but it doesn't seem much different from the results on Google/Bing (as I understand it they use Bing's search index anyway?)

inferiorhuman|10 months ago

Unfortunately, DDG still enshittifies its results with AI garbage as much as it can, both with the intrusive AI blob at the top and by replacing the page summaries with dreadful AI gobbledygook.

anoldperson|10 months ago

AFAICT ChatGPT is mostly useless and can't be trusted to answer questions accurately. So no, mostly all search engines. To be honest, I'm surprised anybody uses it for anything other than trivial uses.

jacobr1|10 months ago

Are you using a paid version? Do you use web-search? And have you tried alternatives like Claude?

I've mostly switched to using Claude these days, with MCPs for websearch and fetching specific remote or local files. It answers questions generally very accurately (from the source documents it identifies) and includes citations.

I've found that people who haven't really tried the latest models, and just rely on whatever knowledge is in the model training, are really missing out on the potential power. GPT-4o and equivalent models really changed the game. And using tools to do a search, pull in your code, or run a db query enables them to either synthesize information or generate context-relevant material. Not perfect for everything, but much better than a year ago, or than what people are doing with the free systems.

0xbadcafebee|10 months ago

Search engines aren't accurate either: they show you 10,000+ pages for your search query, and you probably weren't looking for 10,000 answers. The problem is, they can't read your mind. ChatGPT can show you the results you want, just like search engines can; you just may have to tweak your query.

IAmGraydon|10 months ago

It answers questions extremely accurately in my experience. It's improved a lot in just the last few months.

senko|10 months ago

I use (and pay for) Kagi.

Even without much customization (lenses, scoring, etc.), it's so much better (for my use cases) that I happily pay for it.

Recently I have also started to use Perplexity more for "research for a few minutes and get back to me" type of things.

Queries like "what was that Python package for X" I usually ask an AI right from my editor, or ChatGPT if I'm in the browser already.

disambiguation|10 months ago

I use both, but direct search is faster since I have to fact check the LLM's answer.

2 recent success stories:

I was toying around with an ESP32, experimenting with turning it into a Bluetooth remote control device. The online guides help to an extent, setting up and running sample projects, but the segue into deploying my own code was less clear. LLMs are "expert beginners", so this was a perfect request for one. I was able to jump from demos to live-deploying my own code very quickly.

Another time I was tinkering with OPNsense and setting up VLANs. The router config is easy enough, but what I didn't realize before diving in was that the switch and access point require configuration too. What's difficult about searching this kind of problem is that most of the info is buried in old blog posts and forum threads and requires a lot of digging and piecing together disparate details. I wasn't lucky enough to find someone who did a writeup with my exact setup, but since LLMs are trained on all these old message boards, this was again a perfect prompt playing to their strengths.

footy|10 months ago

I use Kagi and sometimes DDG. When I do a search, I'd rather do my own reading than be lied to. It's not even like using an LLM for code, where you can quickly iterate if needed; there is no way to verify that the information you got is correct, and that is a major problem imo.

matt_trentini|10 months ago

Using search engines is still _significantly_ faster for me for the vast majority of the queries I want answers for.

The results from LLMs are still too slow, vary too much in quality and still frequently hallucinate.

My typical use-case is that when I'm looking for an answer, I make a search query, sometimes a few. Then I scan through the list of results and open tabs for the most promising of them, often recognising trusted, or at least familiar, sites. I then scan through those tabs for the best results. It turns out I can scan rapidly: that whole process only takes a few seconds, maybe a minute for the more complex queries.

I've found LLMs are good when you have open-ended questions, when you're not really sure what you're looking for. They can help narrow the search space.

joseda-hg|10 months ago

I use Kagi, but I will say, the Quick Answer (place a question mark after your query for an LLM-based answer) has been way more useful than I initially thought.

saaaaaam|10 months ago

Do you really, though? I had this conversation with a friend recently, and she was still typing stuff into the browser bar in Chrome and then clicking on stuff from search results. I think a lot of folk think ChatGPT has superseded search, but they don't realise they are still carrying out a load of low-level or transactional search queries via Chrome.

jpc0|10 months ago

100% still search first. If I am not super knowledgeable about the domain I am searching, I use an AI to get me keywords and terminology and then search.

At most I use AI now to speed up my research phase dramatically. AI is also pretty good at showing what is in the ballpark for more popular tools.

However, I am missing forum-style communities more and more. Sometimes I don't want the correct answer; I want to know what someone who has been in the trenches for 10 years has to say. For my day job I can just make a phone call, but for hobbies, side projects, etc. I don't have the contacts built up, and I don't always have local interest groups that I can tap for knowledge.

GuB-42|10 months ago

I only use LLMs when I don't know what I am looking for. Otherwise, search engines all the way.

LLMs can't be trusted: you have no way to tell a correct answer from a hallucination. Which means I often end up searching for what the LLM told me just to check, and it is often wrong.

Search engines can also lead you to false information, but you have a lot more context. For example, a StackOverflow answer has comments, and often, they point out important nuances and inaccuracies. You can also cross-reference different websites, and gauge how reliable the information is (ex: primary source vs Reddit post). A well trained LLM can do that implicitly, but you have no idea how it did for your particular case.

pizzly|10 months ago

I use LLMs for almost everything: summarizing, finding out terminology in very different fields I have no knowledge of, initial research of any field, coding (no more Google searching of Stack Overflow), pretty much everything. I only use a search engine for finding companies/products, due to some mistrust that an LLM would find all the products or companies, including very small ones. But I am very open to dropping search engines completely if this last point is satisfied.

foragerdev|10 months ago

I still use search engines. I do not like to be spoon-fed. I want to learn from real people, not AI-generated shit. ChatGPT and other LLMs are trained on old data; they do not contain newer information. Newer information and knowledge are produced by real humans. LLMs are great for quick fact checking, but not for searching. For example: what's the height of Mount Everest? LLMs will most probably give the right answer.

What are the specs for the new Google Pixel 9a? LLMs can't answer this; maybe after a year they can.

rrr_oh_man|10 months ago

> Newer information and knowledge are produced by the real humans.

Not anymore

dcl|10 months ago

Search engines are still required for me. LLMs still get lots of very important things wrong.

Last night, I asked Claude 3.7 Sonnet to obtain historical gold prices in AUD and the ASX 200 TR index values and plot their ratio. It got all of the tickers wrong (I had to google them), and then it got a bunch of other stuff wrong in the code.

Also yesterday, I was preparing a brief summary of forecasting metrics/measures for a stakeholder and it incorrectly described the properties of SMAPE (easily validated by checking Wikipedia).

I constantly have issues with my direct reports writing code using LLMs. They constantly hallucinate things for some of the SDKs we use.

throwaway422432|10 months ago

Asking for a list of companies in a specific sector also gives you made-up tickers, or at best a list it found on some blog.

It was a bit more useful for questions like "Rank these stocks by exposure to the Chinese market", as you can prioritise your own research, but in the end you just have to go through the individual company filings yourself.

caseyy|10 months ago

I was among the first to champion AI search, even before Perplexity rose to fame. You.com was the first AI search to quote sources well, and I used it extensively.

But now, the veracity of most LLMs' responses is terrible. They often include “sources” unrelated to what they say, and they hallucinate when I search within my own area of expertise. Even Gemini in Google Search told me yesterday that Ada Lovelace invented the first programming language in the 18th century. The trust is completely gone.

So, I'm back to the plain old search. At least it doesn't obscure its sources, and I can get a sense of the veracity of what I find.

esperent|10 months ago

> The trust is completely gone

I mean, for everyone else it was never there to begin with. Hallucinations are constantly raised as the biggest issue with AI. According to the tests, and my experience, newer AI models are objectively better, not worse, than the ones from a few years ago. They still have a long way to go and may never be fully trustworthy, though.

What I have lost trust in, and what I and many others feel has become much worse over the past few years, is Google Search, and all the other search engines that are based on it.

axegon_|10 months ago

Yes, I do. I'd never use an LLM for any meaningful or important information because, by design, they simply shove out the most likely next token, and you get a ton of responses that are pure nonsense once you start digging into them. Mind you, I've been noticing that Google has been terrible for a long while now. Kagi seemed alright at first, but it also gave a lot of nonsense. The final straw for Kagi was the fact that they are backed by Yandex and by extension fund the Russian terrorist state. Lately I've switched to Qwant, and so far it covers almost all of my needs.

mavamaarten|10 months ago

I've just recently switched to Qwant from Google. It serves all my purposes perfectly, except for programming queries unfortunately.

Google seems to be better at bringing up a variety of Stack Overflow posts and blog posts relevant to my search queries. Qwant seems to struggle exactly with that: it's great at giving exactly what I was searching for, but that's sometimes not what I was looking for, if you get what I mean.

In a sense, LLMs are actually perfect for that. But like you say, the super confident hallucinations are just too frustrating. Literally every time I've asked one a serious programming question, it's hallucinated an API that doesn't exist. Everyone seems to be focusing on letting LLMs solve math and thinking problems. That's exactly what _I'm_ good at. I would much rather have an LLM that is good at combining sources and giving me facts (knowledge, rather than thinking) while most of all being able to say "I don't know".

throwawa14223|10 months ago

I use Kagi and I don't think I'd notice if quick answer disappeared. Most of the time I have an answer in the time it would take for GPT to present a prompt.

geocrasher|10 months ago

I use search engines (Google) and when they (it) fails to provide me the responses I need, I turn to ChatGPT. For example:

I recently upgraded my video card, and I run a 4K display. Suddenly the display was randomly disconnecting until I restarted the monitor. I googled my brains out trying to figure out the issue, and got nowhere.

So I gave ChatGPT a shot. I told it exactly what I upgraded from/to, and which monitor I have, and it said "Oh, your HDMI 2.0 cable is specced to work, but AMD cards love HDMI2.1 especially ones that are grounded, so go get one of those even if it's overspecced for your setup."

So I did what it said, and it worked.

juliangmp|10 months ago

I don't even consider using an LLM as a search engine. But I do agree, Google has declined drastically in quality. Personally I'm on DuckDuckGo, though it always depends on the topics you search for.

neilsimp1|10 months ago

Yes, DDG for 95% of issues. Using an AI to search seems really, really, really dumb to me.

nailer|10 months ago

That’s a perfectly fine answer, but providing no supporting arguments makes this a very difficult conversation.

jrvarela56|10 months ago

I’d say come back in a few years for a bad take. But this is already a bad take.

A query in a regular search engine can at best perform like an LLM-based provider like Perplexity for simple queries.

If you have to click through or browse several results, forget it; it makes no sense not to use an LLM that provides sources.

Delk|10 months ago

Yes. I like being able to evaluate my sources. For programming or other technical topics, I'll rather read the original documentation, or third-party information whose credibility I can have some idea about.

For other topics, exact pedantic correctness may not always be as important, but I definitely do want to be able to evaluate my sources nevertheless, for other obvious reasons.

Search is actually pretty much what I want: a condensed list of possible sources of information for whatever I'm looking for. I can then build my own understanding of the topic by checking the sources and judging their credibility. Search seems to have been getting worse lately, sadly, but it's still useful.

s1artibartfast|10 months ago

I feel like it has been years since search was useful for finding original documentation and sources. Around the time fuzzy responses were introduced and quote searching was removed.


leephillips|10 months ago

I subscribe to https://kagi.com/. I use search to find expert and authoritative sources of information with human authors who can be held responsible for their contents, and that I can cite in my own work. I’m not interested in the output of a copy-paste machine that steals others’ work, makes things up, and spits out prose worse than a politician’s.

jasonvorhe|10 months ago

Kagi with a lengthy exclusion/block list (fact checkers, Pinterest, etc), Brave Search, DDG as a rare 3rd option. Not using any explicit AI search engines like Perplexity, but I make use of Kagi's summaries a lot.

layman51|10 months ago

Search engines have gotten worse but they are still much more helpful for finding certain resources compared to LLMs. I am fond of the search operators that still work, like `filetype:pdf`, `site:example.com`, `intitle:trailmix`.

If they get rid of those operators, then that would be really bad. But I have a feeling that’s what a lot of search engine people are itching to do.
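For illustration (the `build_query` helper below is hypothetical, not any search engine's API), chaining those operators into a single query string could be sketched like this:

```python
from urllib.parse import urlencode

def build_query(terms, site=None, filetype=None, intitle=None):
    """Combine plain keywords with common search operators into one query string."""
    parts = [terms]
    if site:
        parts.append(f"site:{site}")       # restrict results to one domain
    if filetype:
        parts.append(f"filetype:{filetype}")  # restrict to a file type, e.g. pdf
    if intitle:
        parts.append(f"intitle:{intitle}")    # require a word in the page title
    return " ".join(parts)

query = build_query("trail mix nutrition", site="example.com", filetype="pdf")
url = "https://duckduckgo.com/?" + urlencode({"q": query})
print(query)  # trail mix nutrition site:example.com filetype:pdf
print(url)
```

The same string works pasted straight into most engines' search boxes, which is the appeal: the operators travel with the query, not with any particular UI.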

trumbitta2|10 months ago

Nah. I'm perfectly conscious of the fact that ChatGPT can't be trusted with searches. Google is still my daily driver.

bromuro|10 months ago

My ChatGPT “searches the web” and provides URLs for its sources as well.

milesvp|10 months ago

There is a class of problems I no longer use search for. I find LLMs give really good results for things like command line usage. Or even things like configuring an application. Basically anything that can summarize lots of disparate sources.

Conversely it’s a huge mistake to rely on LLMs for anything that requires authoritative content. It’s not good at appropriately discounting low quality sources in my experience. Google can have a similar problem, but I find it easier to find good sources there first for many topics.

Where LLMs really replace modern google is for topics you only kind of think should exist. Google used to show some pretty tenuously related links by the time you got to page 5 results and there you might find terms that bring you closer to what you’re looking for. Google simply doesn’t do that anymore. So for me, one of the joys is being able to explore topics in a way I haven’t been able to for over a decade

alganet|10 months ago

Search engines are like older versions of LLMs. They are not designed as human-like assistants, but their goal of providing answers is similar in nature.

Search engines tend to over-summarize less, and they provide lots of references, something LLM researchers have worked hard to achieve.

If they feel lackluster for you, maybe you are not interested in those specific use cases in which they shine.

Similarly, the reason could be that you don't want to check references for yourself, and you prefer to trust the selection of cross references provided by your LLM of choice.

It is likely that your close circle of friends share an identity similar to yours. That is, by many, considered a defining characteristic of friendship. Although it can be a sign of the rising popularity of LLMs, one must take it as an anecdote and not a statistically significant fact.

I do prefer a soft selection of queries on different search engines and different LLM models. Since you asked for an opinion and self-declared an ability to do searches and questions yourself, I don't feel obligated to cite sources for this answer.

sonorous_sub|10 months ago

I use ChatGPT sometimes, but only after I've exhausted google's results for my search and not found the answer I was looking for, or when the query is so obscure that the enhanced problem solving ability of ChatGPT warrants going to it first. I like ChatGPT for solving mundane math problems because I can check its work, and getting the answers I need that way are quicker than doing it myself manually. I still don't trust ChatGPT for anything subjective, because I get spurious results from it anytime the answer to my question is not cut and dry. But what it can do, it does well.

I don't have a circle of friends, so I have no idea what other people are doing, outside of what I read online.

jemmyw|10 months ago

I use search engines all the time (Kagi specifically). AIs don't have up-to-date information. How would you find reviews for products via an AI? It'll just come up with one or two, whereas when you read you can pick out nuance and also tell if it's a genuine review or made-up garbage. Or find a place to buy something. Or a place to go, and read other people's comments on it. Summaries aren't very useful over comments, imo.

I use an LLM a lot for coding. However, I was never as much into doing web searches for programming problems anyway, I used docs more and rarely needed sites like SO. I haven't therefore moved away from search engines for that side of things.

plsbenice34|10 months ago

I use a search engine 99% of the time. Occasionally I use an LLM, but even for checking the most simple information I am not able to have any confidence that the answer it gives is correct. It seems to lie to me every time I use it and contradicts itself when I tell it that it made an error. It provides no citation for where it got its information, and that seems completely essential. I very rarely see any use for it. Even if a search engine is much slower, I will not compromise on knowing where the information is sourced from so I can judge its accuracy, bias, etc. I feel disturbed by all the people that have lower information standards.

Slash65|10 months ago

I think this is the real problem. If it's unsourced, how can I verify the LLM isn't hallucinating? That being said, I started running Open WebUI to host models locally and have heard that some will source their content (I don't know which; I haven't hosted them yet), so that is promising. I also like hosting DeepSeek locally and being able to review its logic process so I can assess how it arrived at its conclusions. All that to say, I still use a traditional search (a self-hosted instance of SearXNG) for 95% of my searches. I like LLMs for bouncing ideas around, but not for finding accurate results quickly.

thefz|10 months ago

Yes, LLMs are no match for my decades of search skills.

Koshcheiushko|10 months ago

> decades of search skills.

By decades, I assume at least two, so a minimum of 20 years. I'm very interested to know about your experience.

Would you please elaborate how do you filter or specifically what techniques you use to get your desired result?

Thanks.

Cthulhu_|10 months ago

I use search engines, but that's because I just yeet in a few words and I get a result, either directly through the preview or after a click through to the results.

With chatbots I first need to formulate a question (or, I feel like I do), then wait for it to slowly churn out an overly wordy response. Or I need to prompt it first to keep it short.

I suppose this difference is different if you already used a search engine by asking it a fully formulated question like "What is a html fieldset and how do I use it?" instead of "html fieldset" and clicking through to MDN.

notepad0x90|10 months ago

For those same questions that you're probably asking ChatGPT, a google search would show me google's LLM answer at top, maybe reddit threads that would illuminate the topic a bit more for me, maybe stackoverflow threads where 2-3 people show different approaches to the solution and maybe some random forum somewhere with example code i could repurpose. Sure, chatgpt will answer the question but it won't have all the other noise that I can glean from and maybe come up with a better solution.

I would use the analogy of consuming a perfectly tasty and nutritional meal crafted by chef chatgpt vs visiting a few restaurants around your neighborhood and tasting different cuisines. neither approach is wrong but you get different things and values out of each approach. Do what you feel like doing!

Last week, there was a specific coding problem I needed help with, I asked chatgpt which gave me a great answer. Except I spent a few hours trying to figure out why the function chatgpt was using wasn't being included, despite the #include directives being all correct. neither chatgpt nor google were helpful. The solution was to just take a different approach to my code, if I only googled, I wouldn't have spent that time chasing the wrong solution.

Also consider this, when you ask a question, there are a bunch of rude people (well meaning) that ask you questions like "what are you really trying to do?" and who criticize a bunch of unrelated things about your code/approach/question. a lot of times that's just annoying but sometimes that gives you really good insights on the problem domain.

maxehmookau|10 months ago

I fundamentally cannot trust a searching system that includes a disclaimer that it can make stuff up (hallucinate) and there's nothing you can do about it.

mooreds|10 months ago

I 100% use search engines, especially to find doc that I know exists. Google/DDG are so fast.

If it is more of an open ended question that I am not sure there'll be a page with an answer for, I am more likely to use ChatGPT/Claude.

rossdavidh|10 months ago

I use ChatGPT only occasionally, mostly for laughs, but primarily use Google. It's not as good as it used to be, but it is still the best available. I think there is an opening for a new search engine company now (unlike 10 years ago when Google was unbeatable), and I suppose LLM's might be a part of it. ChatGPT is not it, though.

Same with my wife (non-technical) and teenage daughter.

agentultra|10 months ago

Yes. Why would I use AI to find information?

jillesvangurp|10 months ago

Because AIs can be faster and more exhaustive than you'll ever be. It's really good at those needle in the haystack type searches that would take ages doing manually.

You don't want AIs reproducing information necessarily. But they are really great at interpreting your query, digging out the best links and references using a search engine and then coming up with an answer complete with links that back that up.

I'd suggest just giving perplexity a spin for a few days. Just go nuts with it; don't hold back. It's one of the better AI driven search tools I've seen.

quadsteel|10 months ago

It depends on the type of query; for anything that has to do with locality or recency, LLMs just don't _really_ work all that well, or even at all.

Someone at work yesterday asked me if I knew which bus lines would be active today due to the ongoing strike. Googled, got a result, shared back in under 10 seconds.

Out of curiosity I just checked with various LLMs through t3.chat, with all kinds of features, none had anything more than a vague "check with local news" to say. Last one I tried Gemini with Deep Research and what do you know, it actually found the information and it was correct!

It also took nearly 5 minutes..

Like, I feel if your search is about _reality_ (what X product should I buy, is this restaurant good, when is A event in B city, recipes, etc.) then LLMs are severely lacking.

Too slow, almost always incomplete answers if not straight up incorrect, deep research tends to work if you have 20 minutes to spare both to get an initial answer and manually go and vet the sources/look for more information in them.

DavidaGinter|10 months ago

I'm using ChatGPT or Perplexity as my defaults for any research/questions I have (open research). I do go to Google when I have a specific company I want to quickly check some details (close research).

deevus|10 months ago

The trend of using LLMs for everything feels like a "when all you have is a hammer, everything starts to look like a nail" situation.

People should do what makes them feel good, but I think we're all going to get a bit dumber if we rely too much on LLMs for our information.

I personally still use search engines daily when I know what it is that I am searching for. I am actually finding that I am reaching less for LLMs even though it is getting easier and cheaper (I pay for T3 Chat at $8USD p/m).

Where I find LLMs useful is when I am trying to unpack a concept or I can't remember the name of something. The results of these chats often lead to their own Google searches. Even after all this development, the best LLMs still hallucinate constantly. The best way that I've found to reduce hallucinations is to use better prompts. I have used https://promptcowboy.ai/ to some success for this.

Manfred|10 months ago

I don’t use LLMs for factual information at all because it is likely biased or wrong.

JohnFen|10 months ago

Yes, I still use search engines. So do all but one of my friends, both technical and not. I have not found LLMs to be anything close to a good replacement for them.

promiseofbeans|10 months ago

I use Kagi search when I want to find something, and chatgpt free when I want a question answered.

II2II|10 months ago

Most of my searches still use traditional search engines for two reasons:

- If I am seeking technical information, I would rather get it from the original source. It is often possible to do that with a search. The output from an LLM is not going to be the original source. Even when dealing with secondary sources, it is typically easier to spot red flags in a secondary source than it is in the output of an LLM.

- I often perform image searches. I have no desire for generated images, though I'm not going to object to one if someone else "curated" the outputs of an AI model.

That said, I will use an LLM for things that aren't strictly factual. i.e. I can judge if it is good enough for my needs by simply reading it over.

legohead|10 months ago

Mostly GPT, but for World of Warcraft, GPT is absolutely horrible. It's like it has been corrupted by the 20 years of bad/incorrect user data, or maybe just the sheer amount of it in general.

As an example, someone typo'd an abbreviation, so I asked GPT and it gladly made up something for me. So I gave it a random abbreviation, and it did the same (using its knowledge of the game).

Even when I tell it the specific version I'm playing it gets so much wrong it's basically useless. Item stats, where mobs are located, how to do a certain quest - anything. So I'm back to using websites like wowhead and google.

tiffanyh|10 months ago

Yes.

Until LLMs stop responding with over confident “MBA talk” that sounds impressive but doesn’t really say much, I’ll continue to use search engines.

maximilianburke|10 months ago

All the time. I don't like LLMs, and don't trust them. I tried to use copilot but ended up shutting it off because I spent more time trying to decipher and ignore its (wrong) suggestions than I did solving the problem.

ChrisArchitect|10 months ago

GPT is completely useless for most of my daily searches. Searching for specific content on a site? I can just put in site:domain.com keywords and get useful results without having to read useless overview paragraphs about the site in question.

Image searches without having to describe every minute detail of what I'm looking for?

Bah, even some searches that are basically looking for wikipedia/historical lookups....so much easier UI in Google Search than chatgpt's endless paragraphs with unclear sources etc.

For some things Google's AI results are helpful too, if not to just narrow down the results to certain sources.

There's no chat interface helping any of this

lqstuart|10 months ago

People started using search engines to ask stupid questions. An LLM like Gemini etc is hands-down better for that. A search engine is still better for actually searching. I do not need a 5000 word screed about a guacamole recipe.

oldjim69|10 months ago

Why would I ever search on ChatGPT? That's not what they are for. They are for helping summarize things, writing copy, designing Excel sheets. Making silly images.

Search is for finding specific websites and products. Totally different things.

whyITS|10 months ago

For sure. For me, searching is learning. I think, there is not only one exact answer to my question. Also because I am not asking my question absolutely correctly, as such precision would take a lot of time. So while reading through the list of possible answers I often develop ideas and/or paths that lead to a possible solution. This is good for my fantasy and for future structuring of my search processes or research considerations. I also occasionally have the opportunity to discover information that is important for my other projects e.g. some days ago I searched for a group of software-filters that find borders in groups of pixels where the pixels build more a soft cloud than a separation between rock and water. I found a company - Tempus AI - that developed a successful working medical AI...so I bought this share...

coderjames|10 months ago

I use DDG multiple times a day, every day. I don't find ChatGPT to be a suitable substitute for helping me locate resources on the web; hallucinated links waste my time trying to get to useful information.

Hikikomori|10 months ago

Try Kagi, it's pretty good with filters for down/upranking sites. I usually don't use AI for search purposes very much, mostly to avoid multiple pages of docs by asking it how to do things.

mikrl|10 months ago

Yep. I ask LLMs the XY questions since they don’t get annoyed, and when my question is very concrete and reduced to its essence, I ask the search engine and usually get a better answer than the LLM would give me.

Basically, there’s a lot of good and specific information on the web, but not necessarily combined in the way I want. LLMs can help break apart my specific combination at a high level but struggle with the human ability to get to solutions quickly.

Or maybe I just suck at asking questions, haha.

entropyneur|10 months ago

Do you have that friend who knows the answer to anything and who you thought was a genius until smartphones appeared and you started googling his answers? LLMs are that guy.

For programming stuff that can be immediately verified LLMs are good. They also cover many cases where search engines can't go (e.g. "what was that song where X did Y?"). But looking up facts? Not yet. Burned many times and not trying it again until I hear something changed fundamentally.

superkuh|10 months ago

I still use google scholar, right dao for deep search (tens of thousands of results), searx instances, and kagi for now but it's not worth the $10/mo for only ~200 results per search.

The serendipity of doing search with your own eyes and brain on page 34 of the results cannot be overstated. Web surfing is good and does things that curated results (i.e., Google's <400, Bing's <900, Kagi's <200, an LLM's very limited single result) cannot.

miki123211|10 months ago

Here's what I do:

1. questions where I expect SEO crap, like for cooking recipes, are for LLMs. I use the best available LLM for those to avoid hallucinations as much as possible, 2.5 pro these days. With so much blogspam, LLMs are actually less likely to hallucinate at this point than the real internet IMO.

2. Questions whose answer I can immediately verify, like "how do I do x in language y", also go to an LLM. If the suggestion doesn't work, then I google. My stackoverflow usage has fallen to almost 0.

3. General overviews / "how is this algorithm called" / "is there a library that does x" are LLMs, usually followed by Googling about the solutions discussed.

4. When there's no answer to my exact question anywhere, or when I need a more detailed overview of a new library / language, I still read tutorials and reference docs.

5. Local / company stuff, things like "when is this place open and how do I call them" or "what is the refund policy of this store" are exclusively Google. Same for shopping (not an American, so LLM shopping comparisons aren't very useful to me). Sadly, online reviews are still a cesspool.

akaike|10 months ago

It’s tough to find anything useful these days because of all the spam content, especially AI-generated content. If I do use a search engine, I usually use it to find something on Reddit.

Freak_NL|10 months ago

For anything where practical skills are concerned (woodworking, metalworking, leatherworking, anodising stuff, etc.) I have to resort to searching on Youtube. There is, fortunately, a lot of information there in the form of tutorials and guides. Search engines are useless there. Most of the pages returned are indeed typical AI slop just there for the ad impressions.

It's extremely disheartening. I have no trust in Youtube staying accessible as a font of public knowledge. It just works out that way now.

Reddit seems hit or miss depending on the topic. Plenty of threads there where [deleted] asked a question and [disgruntled user] replied with something which has been replaced with random text by a fancy deletion tool.

godshatter|10 months ago

I generally use search.brave.com which has an integrative AI Assistant summary. Sometimes the summary does a nice job and other times I just skip it and go find a link that is from somewhere I recognize. If I want to know how to do something, I skip the summary. If I just want to know if something exists or is possible then the summary is sometimes enough. I have no real desire to replace my search engine usage with an LLM.

ergonaught|10 months ago

When Google's results are garbage I will sometimes try ChatGPT or others. This is increasing, but that has more to do with Google producing ever worsening results than any desire to use LLMs to "search".

Google wants to show me products to buy, which I'm almost never searching for, or they're "being super helpful" by removing/modifying my search terms, or they demonstrate that the decision makers simply don't care (or understand) what search is intended to accomplish for the user (ex: ever-present notices that there "aren't many results" for my search).

Recently tried to find a singer and song title based on lyrics. Google wouldn't present either of those, despite giving it the exact lyrics. ChatGPT gave me nonsense until I complained that it was giving me worse results than Google, at which point it gave me the correct singer but the wrong song, and then the correct song after pointing out that it was wrong about that.

Still can't get Google to do it unless my search is for the singer's name and song title, which is a bit late to the party.

udev4096|10 months ago

Yes, why would I not? I, unlike you, do not intend to have a shallow knowledge of the things I want to know about. In a few years, it's going to get worse and no one will have deep expertise in anything (especially junior engineers) if they keep using LLMs. DDG is still far better than Google, although I have started to see more ads on DDG searches, which is quite annoying.

nfriedly|10 months ago

It's a mix of both for me.

I use gemini more on my phone, where I feel like going through search results and reading is more effort, but I'll fall back to searching on duck duck go fairly often.

On a desktop I generally start at duck duck go, and if it's not there, then I don't bother with AI. (I use copilot in my editor, and it's usually helpful, but not really "search").

internet_points|10 months ago

Yes.

ddg is often faster for when I want to get to an actual web site and find up-to-date info, for "search as navigation".

LLMs are often faster for finding answers to slightly vague questions (where you know you're going to have to burn at least as much climate on wading through blogspam and ads and videos-that-didn't-need-to-be-videos if you do a search).

austin-cheney|10 months ago

I used an AI tool for the first time this weekend to get a military CAC to authenticate to websites through Firefox on Arch. It took more than half a dozen uses of the AI tool to get what I was looking for though. Super edge case and even the AI struggled like a human.

Yes, I still use search engines and almost always find what I need in long form if I can’t figure it out on my own.

globnomulous|10 months ago

I never use ChatGPT for anything. I don't trust it for anything (nor should anybody), don't support the company that made it (unethically and on false pretenses as a nonprofit), and have absolutely no desire to contribute to its development.

When I need to search, I use a search engine and try to find a trustworthy source, assuming one is available.

ASalazarMX|10 months ago

Ironically, it's not that LLMs have become super useful, it's that the dominant search engines have become significantly worse, while at the same time they peddle AI results. It almost feels as if it was better for them if you used LLMs.

I won't deny LLMs can be useful, but they're like the news: double-check and form your own conclusions.

simonbw|10 months ago

I use Kagi to search, but I usually use it with a "?" at the end, which triggers an LLM response in addition to search results. It gives me the answer I want like 95% of the time, and I don't feel the need to dig into the search results. For me this tends to be way better than just searching or just using ChatGPT.

mbirth|10 months ago

I’m someone that grew up with AltaVista and thus I’m pretty good with my search terms and modifiers. And I often remember specific phrases from the websites I’m looking for. However, Google is more and more optimised for people NOT knowing what they’re looking for and is now even ignoring “quotes” for exact terms unless you switch it to verbatim mode. Which is a shame.

I’m mostly using my personal SearXNG instance and am still finding what I’m looking for.

On systems where I don’t have access to that, I’m currently trying Mojeek and experiment with Marginalia. Both rather traditional search engines.

I’m not a big fan of using LLMs for this. I rather punch in 3-5 keywords instead of explaining to some LLM what I’m looking for.

okayokayokay123|10 months ago

Switched over to DuckDuckGo a month ago. Results aren’t always great but it works 90% of the time.

I use perplexity pro + Claude a lot as well. Maybe too much but mostly for coding and conversations about technical topics.

It really depends on intent.

I have noticed that I’ve started reading a lot more. Lots of technical books on the iPad based on what I’m interested in at the moment.

cosmic_cheese|10 months ago

LLMs have taken up a significant share of my technical/programming questions, because there’s a pretty good chance it’ll give me a correct or mostly correct answer and if it doesn’t, the results aren’t catastrophic. I don’t trust them for much else though, and so I still use a search engine (Kagi) for most other things. For odd exceptions, I ask the LLM to cite its answers and in the event that it can’t do that or provides false citations, I fall back on search engines.

These tools are useful, but in my view the level of trust seemingly commonly being placed in them far exceeds their capabilities. They’re not capable of distinguishing confidently worded but woefully incorrect reddit posts from well-verified authoritative pages which combined with their inclination for hallucinations and overeagerness to please the user makes them dangerous in an insidious way.

add-sub-mul-div|10 months ago

The idea of taking an answer from any black box is profoundly unacceptable. Even if the black box didn't hallucinate. Why wouldn't I prefer to follow a link to a site so that I can evaluate its trustworthiness as a source?

Why would I want to have a conversation in a medium of ambiguity when I could quickly type in a few keywords instead? If we'd invented the former first, we'd build statues of whoever invented the latter.

Why would I want to use a search service that strips privacy by forcing me to be logged in and is following the Netflix model of giving away a service cheap now to get you to rely on it so much that you'll have no choice but to keep paying for it later when it's expensive and enshittified?

gwbas1c|10 months ago

I have Gemini results in my Google searches. They're "good enough" that I rarely venture to LLMs.

When I do, it's because either I can't think of good terms to use, and the LLM helps me figure out what I'm looking for, or I want to keep asking follow-up questions.

Even then, I probably use an LLM every other week at most.

DaSexiestAlive|10 months ago

Yes, ChatGPT has flaws (strange "hallucinations"?), but I have found the same: questions that I get nowhere with on Google Search and friends (DuckDuckGo/Qwant/Bing/etc.) I give a last try with ChatGPT, and ChatGPT seems to fare considerably better.

Given the time I dedicate to researching things, I feel like I am "more productive" b/c I waste less time.

But I do my due diligence to double-check what ChatGPT suggests. So if I ask ChatGPT to recommend a list of books, I double-check with Goodreads and Amazon reviews/ratings. Like that. I guess it's like having a pair-research session with an AI librarian friend? I am not sure.

But I know that I am appreciative. Does anyone remember how bad chatbots were before the arrival of low-hanging-AI-fruits like generative AI? Intel remembers.

mitthrowaway2|10 months ago

Often I remember having read an article or seen a website in ~2014 or something, and now I want to find a link to it so I can cite it. I use a search engine for this, typing in the gist of what I can remember, set a date range (more clicks than it should take), and that's how I get to it.

This can be very difficult, if there's a lot of semantic overlap with a more commonly-searched mainstream topic, or if the date-range-filtering is unreliable.

Sometimes I'll look for a recipe for banana bread or something, and searching "banana bread recipe" will get me to something acceptable. Then I just have to scroll down through 10 paragraphs of SEO exposition about how much everyone loves homemade banana bread.

Searching for suppliers for products that I want to buy is, ironically, extremely difficult.

I don't trust LLMs for any kind of factual information retrieval yet.

bythckr|10 months ago

I use search engines for two purposes, and I'm not sure if it's a common practice.

Specific searches expecting one answer. This type of search is enhanced by ChatGPT. Google is losing here.

Wild goose chase / brainstorming. For this, I need a broad set of answers. I am looking for a radically different solution. Here, today's Google is inferior to the OG Google. That is for 2 reasons.

1. SEOs have screwed up the results. A famous culprit is Pinterest, along with many other irrelevant sites that fill the first couple of pages.

2. Self-censoring & shadow banning. Banning of torrent sites, politically motivated manipulation. Even though the topic I am searching is not political, there is some issue with the results. I can see the difference when I try the same search in Bing or DuckDuckGo.

wolrah|10 months ago

This thread is yet another thing that makes me fear for the future of humanity.

No, I don't use the hallucination machines to search, and I never will.

I use search engines to search. I use the "make shit up" machine when I want shit made up. Modern voice models are great for IVR menus and other similar tasks. Image generation models have entirely taken over from clipart when I want a meaningless image to represent an idea. LLMs are even fun to make up bogus news articles, boilerplate text to fill a template, etc. They're not search engines though and they can't replace search engines.

If I want to find real information I use a search engine to find primary sources containing the keywords I'm looking for, or well referenced secondary sources like Wikipedia which can lead me to primary sources.

AbraKdabra|10 months ago

Wow u mad bro, chill, OP just asked a simple question.

axelthegerman|10 months ago

I can only imagine how much slower using an LLM would be, especially when it only gives you a single answer which is not what you're looking for and you have to keep asking for "something else"

I echo what others say, Kagi is a joy to use and feels just like Google used to be - useful

sReinwald|10 months ago

It depends on what I'm after. I still use regular searches quite a bit.

But a lot of my classic ADHD "let's dive into this rabbit hole" google sessions have definitely been replaced by AI deep searches like Perplexity. Instead of me going down a rabbit hole personally for all the random stuff that comes across my mind, I'll just let perplexity handle it and I come back a few minutes later and read whatever it came up with.

And sometimes, I don't even read that, and that's also fine. Just being able to hand that "task" off to an AI to handle it for me is very liberating in a way. I still get derailed a bit of course, but instead of losing half an hour, it's just a few seconds of typing out my question, and then getting back to what I've been doing.

nkrisc|10 months ago

Exclusively. I don't want to think of a question to ask, or think about phrasing some prompt so I get a useful result. I just want to throw a few related words and terms into a search box, see where that gets me, and then use the results to refine my search terms further.

nh23423fefe|10 months ago

Those don't seem different to me. It seems like you've internalized the query syntax of search engines and you're fine with erroneous results.

runarberg|10 months ago

How do you know the information it generates is correct?

Just now for example I wanted to know how Emma Goldman was deported despite being a US citizen. Or whether she was a citizen to begin with. If an LLM gave me an answer I for sure would not trust it to be factual.

My search was simple: Emma Goldman citizenship. I got a Wikipedia article claiming it was argued that her citizenship was void after her ex-husband’s citizenship was revoked. Now I needed to confirm it from a different source and also find out why her ex’s citizenship was revoked. So I searched his name + citizenship and got a New Yorker article claiming it was revoked because of falsified papers. Done.

If an LLM told me that, I simply wouldn’t trust it and would need to search for it anyway.

0xbadcafebee|10 months ago

I try to use Google. If I put my search question into the Android Firefox url bar and hit enter, Google will show up with some useful answers (if it's not in the AI answer, Google is useless, because there are 5 pages of bullshit before it begins to show me actual web page search results).

But if I then click the Google search text box at the top, and start typing, it takes 20 seconds for my text to start appearing (the screen is clearly lagged by whatever Google is doing in the background), and then somehow it starts getting jumbled. Google is the only web page this happens to.

I actually like their results, they just don't want me to see their results. Weird business model.

ruszki|10 months ago

LLMs are unreliable transformers of information that is already quite unreliable. So yes, I use Kagi. On average, using a search engine takes less time to achieve the same reliability (of course, perfect reliability is impossible). At least for me, for sure.

mooiedingen|10 months ago

Absolutely. As the great Fravia+ (RIP :( ) once said, it is to your advantage to know where and how to find possible solutions for your problems. And I am willing to go so far as to say:

The more you trust the models, the less cognitive load you spend checking and verifying, which will lead to what people call AI but which is actually nothing more than a for loop over data loaded in memory. Those who still think that "for Message in messages..." can represent any sort of intelligence have already been brainwashed by a new iteration of the "one-armed bandit", where you click regenerate indefinitely with a random seed, distracted from what is going on around you.

n_ary|10 months ago

I still use search engines. LLMs are great but are often 6-12 months out of date. Usually I search for random coding topics, and asking LLMs will often reproduce the outdated top (most-voted) answer verbatim. However, there are some nice folks at SO/SE who come back later and update the answers or submit new ones, but LLMs often don’t return these and instead keep producing variations of the top answer by modifying variable names or adding/removing comments.

Hence, search still remains my hope until SO and the likes decay.

Additionally, many search engines now already generate quick summaries or result snippets without a lot of prompt-fu, so day to day my usage has actually become about a 40:60 (LLM:search) ratio.

Shorel|10 months ago

I still use search engines, and not ChatGPT or any LLM as my primary behaviour.

Of course, I have used Phind and other LLMs, and the results sometimes are useful, but in general the information they give back feels like a summary written for the “Explain Like I'm Five” crowd, it just gives me more questions than answers, and frustrates me more than it helps me.

Where LLMs excel is when I don't know the exact search term to use for some particular concept. I ask the LLM about something, it answers with the right terms I can use in a search engine to find what I want, then I use these terms instead of my own words, and what I want is in the search results, in the first page.

alkonaut|10 months ago

ChatGPT takes 5-10 seconds to respond. Until it's as fast as Google, I'm not switching.

The question is: are you searching for answers to something, or are you searching for a site/article/journal/whatever in order to consume the actual content? If you are searching for a page/article/journal in order to find an answer, then the journal/article itself was just a detour, if the LLM could give you the answer and you could trust it. But if you were looking for the page/article itself, not some piece of information IN the article, then ChatGPT can (at best) give you the same URL Google did, but 100x slower.

wenbin|10 months ago

Yes, if I need (relatively) accurate answers (with the sources / urls / web pages), I'd use keyword search on Google.

Still have a trust issue with LLM/ChatGPT for facts. Maybe in a couple years my mindset will shift and trust LLM/chatgpt more.

elseleigh|10 months ago

I switched from Google to Startpage twelve years ago and have seen no need to change. I have trialed Kagi, and would move there if Startpage became unreliable. I've not used any LLM as a search-engine alternative, and I have no plans to do so.

bbyford|10 months ago

I use DuckDuckGo and Google for search – sometimes I find the Google AI answers helpful (even though they're still sometimes inaccurate) for more categorical questions where I simply need an answer or somewhere to start (i.e. "can you eat charcoal?"); I then go ahead and click some links...

I use ChatGPT for text summarization and translation, and Midjourney for slide decks and graphic-design ideation.

wodenokoto|10 months ago

Constantly use search. Using ChatGPT exclusively is like those kids who only use TikTok.

kryptiskt|10 months ago

No, I judge ChatGPT by the same standards as I judge humans. It's an inveterate liar, and I much prefer to deal with trustworthy sources of information even if they are slightly less convenient than that smarmy bullshitter.

jacobgkau|10 months ago

I still primarily use search engines like Brave Search, DuckDuckGo, Bing, and Google (in that order). I've started sometimes bothering to read the search engines' AI overviews instead of skipping them, although I almost always still click through to their sources for any particular statement.

I just tried ChatGPT and saw that you can ask it to search the web and also can see its sources now. I still remembered how it was last time I used it, where it specifically refused to link out to external sources (looks like they changed it around last November). That's a pretty good improvement for using it as search.

lovehashbrowns|10 months ago

Search has gotten so bad I have replaced like 80% of it with LLMs, typically Claude or Gemini. I've also switched my searches over to duckduckgo whenever I do end up searching for something but even that is on the bad side.

bflesch|10 months ago

Kagi is a very good alternative to Google. When you're actually doing research and taking an exhaustive look at search results, Kagi provides much more detailed results than Google.

I'd rank kagi > chatgpt > google any day.

ttctciyf|10 months ago

If I was habitually asking some llm for nuggets of information I'd have to use web search to verify it in any case.

But in fact I overwhelmingly use search over llm because it's an order of magnitude quicker (I also have google search's ai bobbins turned off by auto-using "web" instead of "all".)

I've used llm "for real" about 3 times in the last two months, twice to get a grounding in an area where I lacked any knowledge, so I could make better informed web searches, and once in a (failed) attempt to locate a piece of music where web search was unsuccessful.

renegat0x0|10 months ago

I use a mixture of solutions for web browsing:

- I use RSS to see what's new, and to search it. My RSS client supports search.

- I maintain a list of domains, so when I want to find a particular place I check my list (I can search domain titles, descriptions, etc.). I have 1 million domains [0].

- If I want more precise information, I try to google it.

- I may also ask ChatGPT.

So in fact I am not using one tool to find information. I use many tools, often narrowing down to the tools most likely to have the answer.

[0] https://github.com/rumca-js/Internet-Places-Database
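The "search my own list of domains" workflow above can be sketched in a few lines. This is a minimal, hypothetical illustration: the record fields (url, title, description) are assumptions for the example, not the actual schema of the linked Internet-Places-Database project.

```python
# Sketch: search a locally maintained list of domains by title or description.
# Field names are illustrative assumptions.

domains = [
    {"url": "https://news.ycombinator.com", "title": "Hacker News",
     "description": "Social news site focused on computing and startups"},
    {"url": "https://lobste.rs", "title": "Lobsters",
     "description": "Computing-focused community link aggregator"},
]

def search_domains(records, query):
    """Return records whose title or description contains the query (case-insensitive)."""
    q = query.lower()
    return [r for r in records
            if q in r["title"].lower() or q in r["description"].lower()]

print([r["url"] for r in search_domains(domains, "news")])  # -> ['https://news.ycombinator.com']
```

The appeal of this approach is that the index is curated by hand, so every hit is a site the owner already decided was worth keeping.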

AndrewDucker|10 months ago

Yup. I want to see a variety of sources, evaluate them, and understand the answer.

laweijfmvo|10 months ago

I just very recently tried using ChatGPT for a situation where I'd typically search, draw some conclusions, search again, etc. Basically, planning an open-ended vacation.

The biggest issue is when GPT returns something that doesn’t match your knowledge, experience, or intuition and you ask the “are you sure?” question, it seems to inevitably come back with “you’re right!”. But then why/how did it get it wrong the first time? Which one is actually true? So I go back to search (Kagi).

So for me, LLMs are about helping to process and collate large bodies of information, but not final answers on their own.

yieldcrv|10 months ago

ChatGPT was 12 months ago for me

I use Claude pretty much exclusively, with GPT as a backup, because GPT errors too much, tries to train on you too much, and has a lackluster search feature. The web UIs are not these companies' priority, as they focus more on other offerings and API behavior. Which means any gripe will not be addressed, and you have to just go for the differentiating UX.

For a second opinion from Claude, I use ChatGPT and Google pretty much the same amount. Raw google searches are just my glorified reddit search engine.

I also use offline LLM’S a lot. But my reliance on multimodal behavior brings me back to cloud offerings.

Adachi91|10 months ago

Agreed, I also use ChatGPT now mostly for searches, because it will pick the best sources that are not content farms, so I don't have to look through garbage to get a result. I started doing this about a year ago and was like "Oh, wow. This could disrupt search engines" and refer to it as "Super Google" now. I always have it link me to the source of what I'm looking for so I don't worry about it hallucinating, for common information I'm looking for reputable sources but don't know which `X` source has them for `Y` info.

johnny_canuck|10 months ago

When searching something non-programming related, I do. For example, I'm building an addition on our home. Searching for building materials, ideas, and any building science questions I have, I often find LLMs lacking. Even then, maybe 40% of the time Gemini gives me a good enough response.

On the flip side, any time I'm searching for something programming-related (FE, JavaScript in my case), search is a last resort, for when an LLM is not giving me the answer I'm looking for.

This is still shocking to me, I really never thought I would replace my reliance on Google with something new.

sans_souse|10 months ago

I will use search as part of any research-based learning as long as it remains a functional option. As long as there's some chance of the AI giving incorrect data where precision is necessary to learn the facts, I will remain leery and manually search those specific areas later on, at the very least.

Operator words still work in Google, albeit less so than in the past; they still do the job.
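For anyone unfamiliar, a few of the standard Google search operators the parent means (all documented, though Google applies some of them loosely these days):

```
"exact phrase"       match the phrase verbatim
site:example.com     restrict results to a single domain
-keyword             exclude pages containing keyword
filetype:pdf         return only PDF documents
intitle:keyword      keyword must appear in the page title
```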

I see the AI as being there to do the major legwork. But the devil's in the details, and we can't simply take its word that something is fact without scrutinizing the data.

Kostic|10 months ago

I stopped using search engines actively 18 months ago. My first stop is an LLM. Once I understand what I actually need, I do a web search to go to the product/tool website. I do this not because LLMs are that good, but because web search result quality went way down over the same period.

One interesting trend that I like is that I started using local LLMs way more in the last couple of months. They are good enough that I was able to cancel my personal ChatGPT subscription. Still using ChatGPT on the work machines since the company is paying it.

hnlurker22|10 months ago

Search engines are up to date: I can search for something that happened today. LLMs are several years behind. Until that's fixed, I think we'll still be using the once-very-useful search engines.

sfblah|10 months ago

I've been tracking this (literally on paper). I've moved around 75% of my queries to LLMs from search engines. And, the main reason I use search engines is because of queries on mobile, where the devices still make it much easier to search with a traditional search engine; and the omnibar.

Keep in mind that I'm not counting, in my 75%, queries where I get my answer from Google Gemini. I'm just guessing, but if you added those in, it would rise to 85-90%.

My thought is if browsers and phones started pushing queries over to an LLM, search (and search revenue) would virtually disappear.

d4mi3n|10 months ago

I think this is mainly a symptom of poor alignment between search engines and their customers. ChatGPT works well now, but the plan there seems to be to monetize search-like queries. I fear it won't be long until chat AI agents using this model bring us back to the same frustrations we have today with Google et al.

There is some room for optimism, though. There's been a rise in smaller search engines with different funding models that are more aligned with user needs. Kagi is the only one that comes to mind (I use it), but I'm sure there are others.

j-bos|10 months ago

I find LLMs good for general knowledge, and clever rubber ducking, but if I have a very specific niche issue in some sort of language or framework, generally search is a better bet to find a stack overflow post of people going through the exact same thing I went through. And even if they don't have solutions, they'll usually link you to more valuable and specific references that you can use.

Though lately for more in-depth research I've been enjoying working with the LLM to have it do the searching for me and provide me links back to the sources.

holografix|10 months ago

Everyone still uses search for news, fact checking, and as a natural-language DNS (e.g. "website for golang"). The two worlds are en route to a merger, and Google will most likely come out ahead.

That’s if they can swing the immense ads machine (and by that I mean the ads organisation not the tech) and point it at a new world and a different GTM strategy.

They still haven’t figured out how to properly incentivise content producers. A lazy way would be to display ads that the source websites would display alongside the summary or llm generated response and pass on any CPM to the source.

JamesAdir|10 months ago

I've recently tinkered with creating an automated install script for a home server. It was good practice for me, and I want to set up a small home server with Pi-hole, Sonarr, and so on. I created it with Claude and ChatGPT, and both performed poorly: huge chunks of code that might run, but that create much more mess than they should. Only after going back and reading the documentation the old-fashioned way, with the help of search, was I able to reduce the size of the script and solve many of its problems.

booleandilemma|10 months ago

Yes, because I still can't trust the output from LLMs, at all, really.

runjake|10 months ago

I'll use Claude about 75% of the time, and then a search engine about 25% of the time. That 25% of the time, I'm usually looking for:

- Specific documentation

- Datasets

- Shopping items

- Product reviews

But for the search engines I use, their branded LLM response takes up half of the first page. So that 25% figure may actually be a lot smaller.

It's important to note that these search engine LLM responses are often ludicrously incorrect -- at least, in my experience. So now I'm in this weird phase where I visit Google and debate whether I need to enter search terms or some prompt engineering in the search box.

Saris|10 months ago

ChatGPT is frequently wrong with its answers, so yes search and forums and websites are still the best option.

For example I asked it about rear springs for a 3rd gen 4runner and it recommended springs for a 5th gen.

jerejacobson|10 months ago

I recently had someone reach out and tell me they liked a Chrome extension that I built. They found it by asking ChatGPT if there was a way to do X on Y, and it recommended my extension to them.

I was very surprised to hear this, and it made me wonder how much of traditional SEO will be bypassed through LLM search results. How do you leverage trying to get ranked by an LLM? Do you just provide real value? Or do you get featured on a platform like Chrome Extensions Store to improve your chances? I don't know, but it is fun to think about.

nsluss|10 months ago

The category of search I've stopped doing is the one where I'd append "reddit" to the end. The models do a great job of distilling a wide range of opinion into something super digestible, better than the old flow of looking for a bunch of threads on the subject from expert amateurs and having to read them all myself.

For the people who say they've reduced their search engine use by some large percentage, do you never need to find a particular document on the web or look for reference material?

d1an|10 months ago

My native language is Chinese. Most of my colleagues use www.baidu.com as their search engine. But I do not like Baidu, because the search results are full of ads. I also use ChatGPT or DeepSeek. But in my work (Linux kernel driver porting), the AI is not good enough; I cannot trust it completely. So in some cases, I still use Google to search for answers with keywords in English. If you want to know why I use English keywords: because of CSDN, a site that has been polluting the internet for far too long.

GuinansEyebrows|10 months ago

Yes, of course, because I find it way more informative to search in broad terms, digest varying sources, and arrive at conclusions, rather than to ask a question that invariably lacks enough context (which I'd find by reading docs or SO posts) to actually produce helpful results, let alone a deeper understanding of the topic than before I started.

Learning is fun! Reading is good for you! Being spoon fed likely-inaccurate/incomplete info or unmaintainable code is not why i got into computers.

locallost|10 months ago

It's become kind of tempting to use ChatGPT for that, because you don't have to search yourself for that one post somewhere that describes what you're looking for. But I've found little use for it for anything critical, because it's just wrong way too often. Recently it gave me instructions for using an API that turned out to have been deprecated for two years. But for setting up my parents' iPad, where I was looking for a setting I couldn't find, it's fine.

JimT777|10 months ago

I refuse to use AI. Artificial Intelligence enables and encourages Natural Stupidity.

degrees57|10 months ago

I switched my default search engine in my browser to perplexity.ai a few months ago and am super happy with it. The only time I use Google anymore is to specifically visit www.google.com and put site:example.com in the search field, when I know the results I am looking for are only found within that site. I've only had to do that five or six times in the last few months.

And yes, just plain old Google search is completely lackluster in comparison to the perplexity.ai search I get to do today.

llm_nerd|10 months ago

Almost always use search-augmented LLMs now, to infinitely better results. Whether I'm wondering about a movie or looking for information on a programming language feature, or even specifics about niche things, an LLM gets me there much quicker.

Earlier today I was trying to remember the name of the lizard someone tweeted about seeing in a variety store. Google search yielded nothing. Gemini immediately gave me precise details of what I was talking about, and linked to web resources about it.

grishka|10 months ago

Of course I do and always will. Not once did I consider using an LLM for anything even remotely serious. Generative AI, in any form, is a toy, a novelty. It's fun to play with and make fun of sometimes, but that's about it. And honestly, I'm tired as hell of generative AI being this new hotness that everyone must stick somewhere somehow. In their products, in their workflows, in their lives. I'm so looking forward to this fad passing.

mancerayder|10 months ago

I've become very, very bad at Google searches. Nothing seems accurate anymore (I should say precise); I'm hitting vendor/official/party-line stuff, or wordy blogs that say nothing.

I use ChatGPT at home constantly: for history questions, symptoms of an illness, identifying a plant while hiking, remembering a complex term or idea I can't articulate, tips for games; the list goes on.

At work it's Copilot.

I've come to loathe and mock Google search and I can't be the only one.

linacica|10 months ago

Depends on the content. Sometimes I use GPT to find stuff I'm too lazy for and I know Google would more likely waste my time, but generally I still use Google; there are a lot of miscellaneous searches where an LLM would do worse than a search engine (currency exchange rates, stock prices, quick facts, etc.). Though I wish Google had an option to block some sites from showing up; some searches are just filled with garbage, and I would like to block the whole domain from ever appearing.

lo_fye|10 months ago

I was primarily using ChatGPT & Perplexity. Then I started calling B.S. on them, and more often than not their replies were "You're right! Sorry about that!" Simply saying "that's wrong" reveals a terrifying amount of "hallucination" by AI. Far, far more answers stated confidently as fact turned out to be completely made up.

If I want to play with ideas, I chat with AI. If I need facts, I use search.

noer|10 months ago

It depends on what I'm looking for. If I have a specific thing that I'm just looking for an answer on, then I'll typically use ChatGPT. Most of my Google searches are either navigational, things I know Google will return more quickly than ChatGPT ("how old is this actor", "when was xx player drafted"), or cases where I'm interested in browsing results (looking for recipes for borscht, I want to see a few different ones).

satisfice|10 months ago

I don’t trust LLMs for search and neither should anyone. I speak as a professional tester. They are essentially untested. It should offend us all that OpenAI puts such unreliable software out there.

Unlike Google or DuckDuckGo, which serve up links that we can instantly judge as relevant to us, LLMs spin stories that sound pretty good but may be, and often are, insidiously wrong. It's too much effort to fact-check them, so people don't.

shmerl|10 months ago

Yes. But I noticed a trend of people asking stuff like "why doesn't this work" in various community forums, which ends up being them sourcing the method from the likes of ChatGPT / Gemini, etc. Lesson - don't do that, especially when you are going to be wasting others' time on explaining why things didn't work. Search things properly. Read documentation. Even if you use AI, never trust its results.

jonathanstrange|10 months ago

I don't use ChatGPT or any other AI. If I search for something, I search for authoritative documents on the topic. That is, official docs, articles, books. Asking ChatGPT would be like asking a random person who provides an opinion without any guarantees. It's potentially useful information but needs to be verified. So I need to search again to verify the information and get to an authoritative source.

InfiniteLoup|10 months ago

Almost all of my “searches” are now done by either ChatGPT or Claude.

I'm still using Google for searches on Reddit these days because Reddit's own search engine is terrible.

nullbio|10 months ago

Only a matter of time before OpenAI starts selling advertising services and weighting specific websites and services higher than others in their response generation for money. We should really outlaw this before it becomes a problem, and before it becomes a thing, because once it happens there is no turning back. Unlikely that anyone with the power to do this will actually have the foresight to do it though.