I think you’re welcome to that opinion and are far from alone, but (1) I am very happy to pay for Claude, even $200/mo is worth it, and (2) idk if people just sort of lose track of how far things have come in the span of literally a single year, with the knowledge that training infra is growing insanely and people are solving one fundamental problem after another.
ModernMech|3 months ago
The pool of people willing to pay for these premium services for their own sake is not big. You've got your power users and your institutional users like universities, but that's it. No one else is willing to shell out that kind of cash for what it is. You keep pointing to how far it's come, but that's not really the problem, and in fact that makes everything worse for OpenAI et al. They don't have a moat, they don't have customer lock-in, and soon they won't have technological barriers either. The models are not getting good enough to be what they promise, but they are getting good enough to put themselves out of business. Once this version of ChatGPT gets small enough to fit on commodity hardware, OpenAI et al will have a very hard time offering a value proposition.
Basically, if OpenAI can't achieve AGI before a GPT-4-class LLM can fit on desktop hardware, they are toast. I don't like those odds for them.