top | item 42009370

Willamin | 1 year ago

I find myself unable to search for more complex subjects when I don't know the keywords, specialized terminology, or even the title of a work, yet have a broad understanding of what I'd like to find. Traditional search engines (I'll jump between Kagi, DuckDuckGo, and Google) haven't proved as useful at pointing me in the right direction when I need to spend a few sentences describing what I'm looking for.

LLMs, on the other hand (free ChatGPT is the only one I've used for this; I'm not sure which models), give me an opportunity to describe in detail what I'm looking for, and I can provide extra context if the LLM doesn't immediately give me an answer. Given LLMs' propensity for hallucination, I don't take their answers as solid truth, but I'll use the keywords, terms, and phrases they give me to leverage traditional search engines to find a more authoritative source of information.
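That workflow — pull candidate terminology out of a free-form LLM answer, then hand the terms to a traditional engine as exact-match queries — can be sketched in a few lines. The extraction heuristics here (terms the model put in quotes, plus capitalized multi-word names) are purely illustrative, not anything a particular tool provides:

```python
import re

def candidate_queries(llm_answer: str, max_terms: int = 5) -> list[str]:
    """Pull likely search terms out of a free-form LLM answer and
    wrap each in quotes for a verbatim-style engine search."""
    # Terms the model itself highlighted in quotes
    quoted = re.findall(r'"([^"]+)"', llm_answer)
    # Multi-word capitalized phrases, which often name works or concepts
    phrases = re.findall(r'\b(?:[A-Z][a-z]+(?:\s+[A-Z][a-z]+)+)\b', llm_answer)
    seen, terms = set(), []
    for term in quoted + phrases:
        key = term.lower()
        if key not in seen:  # de-duplicate case-insensitively
            seen.add(key)
            terms.append(term)
    # Quote each term so the search engine matches it exactly
    return [f'"{t}"' for t in terms[:max_terms]]
```

The point isn't the heuristics themselves but the division of labor: the LLM supplies vocabulary, and the quoted queries keep the follow-up search anchored to it.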

---

Separately, I'll also use LLMs to search for knowledge I suspect is obscure enough that it would be difficult to find by wading through the more popular sites on traditional search engine results pages.

layer8 | 1 year ago

> I find myself being unable to search for more complex subjects when I don't know the keywords, specialized terminology, or even the title of a work, yet I have a broad understanding of what I'd like to find.

For me this is typically a multi-step process. The results of a first search give me more ideas of terms to search for, and after some iteration I usually find the right terms. It’s a bit of an art to search for content that maybe isn’t your end goal, but will help you search for what you actually seek.

LLMs can be useful for that first step, but I always revert to Google for the final search.

Also, Google Verbatim search is essential.

extr | 1 year ago

Yeah, this is exactly how I use LLMs + Google as well. I would even go further and say that most of Google's value to me is the ability to find a specific type of source by searching for exact terminology. I think AI search is fatally flawed for this reason. For some things, generic factual information is okay ("What's the capital of France?"), but for everything else, the information is inextricably bound up with its context. A spammy SEO blog and a specialist forum might make identical claims, but the claim is more valuable when it comes from the latter; it's simply higher signal.

Google used to care about this but no longer does; PageRank sucks and is ruined by SEO. Search still "works" because, if you're good, you can guess the kind of source you're looking for and which keywords might surface it. LLMs help with that part, but you still need to read the results yourself, because they don't yet have the theory of mind to make good value judgments about source quality and communicate them.

erosivesoul | 1 year ago

I also find some use for this. I'll often ask whether there's a specific term for a thing I only know generally, which usually yields better search results, especially for obscure science and technology topics. The newer GPTs are also decent at math, but I still use Wolfram Alpha for most of that, just because I don't have to double-check it for hallucinations.

niutech | 1 year ago

You can try Brave Search, which provides a classic SERP as well as an AI answer.

Lws803 | 1 year ago

You might like what we're building in that sense :D (full disclosure: I'm the founder of Beloga). We're building a new way to search with programmable knowledge. You're essentially able to call on search from Google, Perplexity, and other search engines by specifying them as @ mentions together with your detailed query.
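For illustration only — I don't know Beloga's actual syntax or internals — an @-mention router along those lines might look like the sketch below, where the engine names and the strip-and-fan-out behavior are my own assumptions:

```python
import re

# Hypothetical engine names; a real product's @-mention vocabulary may differ.
KNOWN_ENGINES = {"google", "perplexity", "bing", "kagi"}

def route_query(raw: str) -> tuple[list[str], str]:
    """Split a query like '@google @perplexity rust borrow checker intro'
    into the engines to fan out to and the remaining query text."""
    engines: list[str] = []

    def grab(match: re.Match) -> str:
        name = match.group(1).lower()
        if name in KNOWN_ENGINES:
            engines.append(name)
            return ""          # strip recognized mentions from the query
        return match.group(0)  # leave unknown @words untouched

    text = re.sub(r"@(\w+)", grab, raw)
    return engines, " ".join(text.split())
```

The same detailed query then goes to each mentioned engine, which matches the thread's broader pattern of combining an LLM-style description with conventional search backends.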