top | item 35988224

bitL | 2 years ago

How are they going to inject ads there, though? Moreover, the cost per ad will be massive given how expensive the inference is. And a self-hosted Vicuna will likely behave similarly, potentially rendering the whole search experience pointless in the future. Anyone can feed a bunch of DDG results into a self-hosted, good-enough LLM and build a basic desktop app around it, bypassing Google completely.
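The "DDG results into a local LLM" idea boils down to retrieval-augmented prompting. A minimal sketch of the prompt-building step, assuming a hypothetical search-result shape (`title`/`snippet` dicts); the call to a locally hosted model (e.g. a llama.cpp server) is left as a stub:

```python
# Hypothetical sketch: fold search-result snippets into one prompt for a
# self-hosted LLM. Result shape and function names are assumptions.

def build_prompt(query, results):
    """Build a grounded prompt from search results for a local model."""
    context = "\n".join(
        f"[{i + 1}] {r['title']}: {r['snippet']}" for i, r in enumerate(results)
    )
    return (
        "Answer the question using only the sources below.\n\n"
        f"Sources:\n{context}\n\n"
        f"Question: {query}\nAnswer:"
    )

if __name__ == "__main__":
    results = [
        {"title": "Vicuna", "snippet": "An open chat model fine-tuned from LLaMA."},
    ]
    prompt = build_prompt("What is Vicuna?", results)
    print(prompt)
    # A real desktop app would now POST `prompt` to the locally hosted model.
```

The search fetch and the model call are the only network-bound pieces; everything else is plain string assembly, which is why the barrier to building such an app is low.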

summerlight | 2 years ago

> Moreover, the cost per ad will be massive given how expensive the inference is.

The existing system can still do a decent job of predicting the value per query, so they'll apply it to perhaps only 1-5% of queries. It's still expensive, but acceptable if the additional cost is meant to be transient over 1-2 years.
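The selective-serving idea above is just percentile thresholding on a predicted value. A minimal sketch, with a stand-in predictor (any real system would plug in its existing value-per-query model):

```python
# Sketch of running the expensive LLM path only for the top-value
# fraction of queries. The value predictor here is a stand-in.

def select_llm_queries(queries, predict_value, fraction=0.05):
    """Return the top `fraction` of queries ranked by predicted ad value."""
    ranked = sorted(queries, key=predict_value, reverse=True)
    k = max(1, int(len(queries) * fraction))
    return ranked[:k]

if __name__ == "__main__":
    queries = [f"q{i}" for i in range(100)]
    # Stand-in predictor: pretend the predicted value is the query index.
    top = select_llm_queries(queries, lambda q: int(q[1:]), fraction=0.05)
    print(top)  # the 5 highest-value queries
```

The point is that the gating model is cheap relative to inference, so the blended cost per query stays close to the fraction times the LLM cost.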

> Anyone can incorporate a bunch of DDG results to a self-hosted good-enough LLM

Perhaps (much) less than 0.01% of the population? Technically, anyone can self-host their own blog, but not many do.

danans | 2 years ago

> Moreover, the cost per ad will be massive given how expensive the inference is.

Why would they need to use LLM inference to show ads? If anything, they can use their standard, efficient methods to match ads based on the output of the LLM. Generative Search doesn't look like raw LLM output anyway: it has obviously been trained to pull from structured data sources, or at least its output is post-processed to present that data.
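Matching ads against the LLM's output text needs no further inference; classic keyword scoring is enough. A toy sketch, with an invented ad inventory:

```python
# Sketch of the point above: ads are matched against the generated text
# with ordinary keyword overlap, not with LLM inference. The inventory
# shape and scoring are invented for illustration.

def match_ads(llm_output, ad_inventory):
    """Rank ads by how many of their keywords appear in the LLM output."""
    words = set(llm_output.lower().split())
    scored = [
        (sum(kw in words for kw in ad["keywords"]), ad["name"])
        for ad in ad_inventory
    ]
    scored.sort(reverse=True)
    return [name for score, name in scored if score > 0]

if __name__ == "__main__":
    ads = [
        {"name": "hiking-boots", "keywords": ["hiking", "boots", "trail"]},
        {"name": "car-insurance", "keywords": ["insurance", "car"]},
    ]
    print(match_ads("the best trail hiking boots are waterproof", ads))
    # → ['hiking-boots']
```

In other words, the expensive generation step and the cheap ad-matching step can stay decoupled, which is the comment's core objection to the "ads need inference" premise.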