top | item 43208651

vecter | 1 year ago

I've noticed that since I've started using ChatGPT, I've almost entirely stopped using Google (except for the rare case where I need a specific website but don't remember the URL). In addition to a bunch of technical questions related to my work, my ChatGPT chat log has the most mundane things like:

  - What is platos frios
  - Can you download Netflix videos to your local device
  - Who composed the Top Gun theme
  - Who have been the most successful American Idol winners
  - If I check-in the day before a United Airlines flight, can I still buy additional checked bags when I go to the airport
  - If I'm buying a Schwinn IC4 indoor spin bike, do I need a floormat for it also
  - What is pisco 
  - In the US, what is the format for EINs?
  - Is it bad to use tap water in your humidifier?
  - Which NBA players are on supermax contracts
  - What are some of the best steakhouses in Manhattan?
  - How much and how long does it take to procure a DUNS number?
  - In terms of real estate, what is historic tax credit development

LLMs give me the answers I want immediately. Before, I would use Google basically as a proxy to find websites that I'd then have to sift through to find the answers to these questions. It was another layer of indirection. Now that I can have an LLM just tell me the answer (you still need to approach it with a skeptical eye, since it can certainly get some things wrong), I don't need to "search" the search results pages themselves and read multiple articles and blog posts to hopefully find the answer to my question.


drdaeman|1 year ago

The problem with this approach is that an LLM gives me unreliable answers. I know this because sometimes I ask about things that I used to know but forgot and needed to refresh my memory - and sometimes the answers were incorrect. So, unfortunately, a search engine validation step is still a necessity.

Asking an LLM to provide a link does NOT work, as they hallucinate links just fine, giving links that are either broken or do not contain the information the LLM says they should. Using search tools through an LLM (like ChatGPT's "search" function) sort of works (at least the link will be correct - you still need to check whether the content means what the LLM says it does), but it's quite limited and cannot be fine-tuned (I don't use Google but rather prefer Kagi, and I tend to rely heavily on Kagi's lenses, site: queries and negative terms to scope and refine searches).

In other words: please do NOT trust an LLM's answers, even if they sound plausible. Always verify.

scarface_74|1 year ago

You aren’t using the paid version of ChatGPT, are you? It does a real-time search of the web for links.

mitthrowaway2|1 year ago

After some experience and testing, I've learned not to ask LLMs questions like "who did X" and "what is company Y's policy about Z", because they tend to hallucinate responses (even for well-known people).

What I've not yet figured out is how to handle being surrounded by a society of people who trust LLMs for their factual answers anyway. I think even if I'm careful about selecting my sources, the background noise floor is going to climb until there's no signal left.

Thorentis|1 year ago

People used to criticise Wikipedia for being bad due to being crowdsourced (at least in school they did). Now, Wikipedia looks like one of the best antidotes to LLMs.

nitwit005|1 year ago

I suspect this is more a case of garbage in, garbage out. The existing web results have invented answers.

People created websites to "answer" people's search queries about celebrity net worth, whether some celebrity is gay, whether they are in a relationship, etc. They obviously frequently did not know, and made a guess, or relied on tabloids as a source, which also frequently make things up.

milesrout|1 year ago

Does it matter to you whether the answers you are given are correct? Google results are sometimes wrong but the web gives you signals about reliability, like the author, etc. If I want to know who wrote a paper, I can google the paper's name and get an ACM page about the paper or a PDF of the paper and read the author's name. Very reliable. If ChatGPT says the name I have no clue if it is right.

Having multiple sources is a good thing. Using just ChatGPT is like only ever using Wikipedia as your source of all information, but put through a filter that removes all sources, attribution information, cross-linking, history, and those notices at the top of pages saying an article has issues - AND normalises the writing style so that you can't even use bad spelling and grammar as a signal of inaccuracy.

RandallBrown|1 year ago

I really like having the site where the answer came from so I can instantly judge how likely the answer is to be correct.

ChatGPT does correctly answer your question about airline bags, but I have no way of knowing whether it made that answer up, because so many airlines have the same policy.

Google at least gives you links to United's baggage policies. The AI overview in Google also "cites its sources", which sort of gives you the best of both worlds. (I'm sure the accuracy of Google's AI vs. ChatGPT is up for debate.)

hunter-gatherer|1 year ago

I might misunderstand, but can't you just ask for the reference? I've also been using Gemini a lot to basically replace my search engine, but I always tell it to give me a reference. I've had pretty good results with this approach.

moi2388|1 year ago

GPT has web search with links to the websites.

boilerupnc|1 year ago

I wonder how many new, strange, surprising and wonderful things you indirectly stumbled into during those sifting exercises. Hyper-optimized search has some downsides. I love getting answers to my specific questions, but that only ever covers the "known unknowns" space. Through skimming and sifting, using websites as proxies, I enjoyed surprises from the "unknown unknowns" space.

hunter-gatherer|1 year ago

10 years ago I'd have agreed with you completely. I definitely get your point and share some of that same sentiment, but search results these past 10 years have become overwhelmingly absurd, shallow, and barely tangentially related to what I'm looking for.

upcoming-sesame|1 year ago

Same for me. The only thing I still use Google for is up-to-date data, as LLMs are not great with that yet.