picardo | 3 months ago
With an LLM, the inference cost per query is orders of magnitude higher. Unless they have a way to command significantly higher CPMs -- perhaps by arguing that intent signal is stronger in a conversation than in a keyword search -- it feels like a difficult margin to sustain.