top | item 37842047

namelosw | 2 years ago

The problem is that the industry hasn't figured out how to properly price LLM applications yet.

Before LLMs, it was perfectly possible to spin up a SaaS on a $5 DigitalOcean VM and charge $4.99 per seat monthly. If you use low-overhead tech like Go and SQLite, you can get pretty far with a decent user base.

But LLMs are inherently costly compared to those traditional apps. Whether you're calling OpenAI or running your own GPU cluster, it's going to be way more expensive. Running your own GPUs might end up costing even more because of utilization problems and upfront costs.

The subscription model was something of a silver bullet for SaaS, but it's probably not going to work well in the AI era.

OpenAI, ElevenLabs, Runway, and Midjourney all have subscription models, but the quotas are strict and tight. The "unlimited" plan is effectively pay-as-you-go.

The early wave of LLM products with unlimited subscription models, like GitHub Copilot and Notion AI, is probably priced way too low. $7 or $10 a month is far too little to support heavy usage.
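A back-of-the-envelope calculation shows why. All the numbers below (per-token prices, usage pattern) are illustrative assumptions, not any provider's real rates:

```python
# Rough sketch: what one heavy user might cost the app operator per month.
# Prices and usage figures are made-up assumptions for illustration only.

PRICE_PER_1K_INPUT = 0.003   # assumed $ per 1K input tokens
PRICE_PER_1K_OUTPUT = 0.006  # assumed $ per 1K output tokens

def monthly_api_cost(requests_per_day, input_tokens, output_tokens, days=22):
    """Estimated provider bill for one user over a working month."""
    per_request = (input_tokens / 1000) * PRICE_PER_1K_INPUT \
                + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT
    return requests_per_day * per_request * days

# A heavy Copilot-style user: 500 completions a day,
# ~1500 tokens of context in, ~100 tokens out per request.
print(f"${monthly_api_cost(500, 1500, 100):.2f}/month")
# → $56.10/month under these assumptions
```

Under those assumed prices, one heavy user burns through five times a $10 subscription before you've paid for anything else.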

But charging $50 might scare most users away because it exceeds people's expectations for SaaS pricing - and you'd probably still end up losing money. And hobby users may end up subsidizing the core users, which leads us back to sophisticated pricing tiers like those of ElevenLabs and Runway.

Are there alternatives? I dunno. Maybe implement bring-your-own-key properly? Like OAuth, but for LLMs? It'll definitely be interesting to see how things turn out.
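The bring-your-own-key idea could look something like this: the app never pays for inference; each user supplies their own provider key, and the app attaches it to requests made on their behalf. The endpoint and Authorization header below follow OpenAI's public HTTP API; the function name and flow are a hypothetical sketch, not anyone's actual implementation:

```python
# Hypothetical bring-your-own-key (BYOK) sketch: build a request that is
# billed to the *user's* API key rather than the app operator's.
import json
import urllib.request

def build_completion_request(user_api_key: str, prompt: str) -> urllib.request.Request:
    """Construct (but don't send) a chat-completion request using the user's key."""
    body = json.dumps({
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {user_api_key}",  # user's key, not the app's
            "Content-Type": "application/json",
        },
    )

req = build_completion_request("sk-user-supplied", "Hello")
print(req.get_header("Authorization"))  # → Bearer sk-user-supplied
```

The hard parts this sketch skips are exactly where the OAuth comparison comes in: securely storing the key, scoping what the app may do with it, and letting the user revoke it.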
