vecter | 1 year ago
- What are platos fríos?
- Can you download Netflix videos to your local device?
- Who composed the Top Gun theme?
- Who have been the most successful American Idol winners?
- If I check in the day before a United Airlines flight, can I still buy additional checked bags when I go to the airport?
- If I'm buying a Schwinn IC4 indoor spin bike, do I need a floor mat for it as well?
- What is pisco?
- In the US, what is the format for EINs?
- Is it bad to use tap water in your humidifier?
- Which NBA players are on supermax contracts?
- What are some of the best steakhouses in Manhattan?
- How much does it cost and how long does it take to procure a DUNS number?
- In terms of real estate, what is historic tax credit development?
LLMs give me the answers I want immediately. Before, I used Google basically as a proxy: it pointed me at websites that I'd then have to sift through to find the answers to these questions. It was another layer of indirection. Now that an LLM can just tell me the answer (you still need to approach it with a skeptical eye, since it can certainly get some things wrong), I don't need to "search" the search results pages themselves and read multiple articles and blog posts in the hope of finding the answer to my question.
drdaeman | 1 year ago
Asking an LLM to provide a link does NOT work: they hallucinate links just fine, giving links that are either broken or don't contain the information the LLM claims they should. Using search tools through an LLM (like ChatGPT's "search" function) sort of works (at least the link will be correct, though you still need to check whether the contents mean what the LLM says they do), but it's quite limited and cannot be fine-tuned. (I don't use Google but prefer Kagi, and I rely heavily on Kagi's lenses, site: queries, and negative terms to scope and refine searches.)
In other words: please do NOT trust an LLM's answers, even if they sound plausible. Always verify.
mitthrowaway2 | 1 year ago
What I've not yet figured out is how to handle being surrounded by a society of people who go ahead and trust LLMs for their factual answers anyway. Even if I'm careful about selecting my sources, I think the background noise floor is going to climb to the point that there's no usable signal-to-noise ratio left.
nitwit005 | 1 year ago
People created websites to "answer" search queries about celebrity net worth, whether some celebrity is gay, whether they are in a relationship, etc. They obviously frequently didn't know and just made a guess, or relied on tabloids as a source, which also frequently make things up.
milesrout | 1 year ago
Multiple sources are a good thing. Using just ChatGPT is like using Wikipedia as your only source of all information, but put through a filter that removes all sources, attribution, cross-linking, history, and those notices at the top of pages saying an article has issues, AND that normalizes the writing style so you can't even use bad spelling and grammar as a signal of inaccuracy.
RandallBrown | 1 year ago
ChatGPT does correctly answer your question about airline bags, but I have no way of knowing whether it made that answer up, since so many airlines have the same policy.
Google at least gives you links to United's baggage policy. Google's AI overview also "cites its sources", which sort of gives you the best of both worlds. (I'm sure the relative accuracy of Google's AI vs. ChatGPT is up for debate.)