shaism | 1 year ago
My take: Today, using AI for search-related problems is still not cost-effective for most use cases. That said, the landscape is evolving quickly. First, some searches create more value than others. An individual consumer doing a Google Search is totally different from a lawyer searching for reference material. Areas where an individual search creates more value can already benefit from AI today. Second, LLMs are becoming exponentially cheaper, driven by more cost-effective computing but also more cost-effective models. Look at the pricing of GPT-4o-mini vs GPT-4 (the original). The models are comparable in performance for many search-related problems, but the price has decreased by 200x in 1.5 years ($0.15 vs $30 per 1M tokens). If that price trend continues, more and more search use cases will benefit from AI.
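A quick sketch of the arithmetic behind the 200x claim, using the list prices per 1M input tokens quoted above (the per-year rate is my own extrapolation from those two data points, not a figure from the comment):

```python
import math

# Input-token list prices quoted in the comment, in USD per 1M tokens.
gpt4_price = 30.00       # GPT-4 (original)
gpt4o_mini_price = 0.15  # GPT-4o-mini

ratio = gpt4_price / gpt4o_mini_price
print(f"Total price drop: {ratio:.0f}x")  # 200x

# Implied annualized drop if the trend were smooth over 1.5 years:
years = 1.5
annual = math.exp(math.log(ratio) / years)
print(f"Implied annualized drop: ~{annual:.0f}x per year")
```

Whether that rate continues is the open question; the point is just that even a fraction of it would keep pulling more search use cases under the cost line.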