Ask HN: How many AI startups are just OpenAI/Anthropic/etc. API calls with a UI?
16 points | Gshaheen | 1 year ago
Some questions I'm interested in:
- How can you tell if a company is primarily using foundation model APIs?
- What percentage of AI startups fall into this category?
- Are there examples of companies doing this particularly well or poorly?
- What constitutes legitimate value-add on top of foundation models?
muzani|1 year ago
But we're seeing companies like Cursor which pay a lot of attention to how the models interact with everything. They're not just prompting the AI: they index files, they search and mimic styles. You can @ a certain file to use it as a reference. The composer is autonomous; it even extracts commands from the AI's output to run, or extracts only the code needed, and it double-checks that what it's writing is true. There's a whole system in there. It's more like a modern car with a proper driveshaft, gears, pedals and so on, while Copilot still feels like a box on four wheels attached to an engine.
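To make one of those pieces concrete, here's a minimal sketch of what "extracting commands from the AI to run" can look like: pulling fenced blocks out of a model reply so a tool can act on them. This is purely illustrative (the helper name and approach are assumptions, not Cursor's actual code):

```python
import re

# Fence delimiter built at runtime so this sketch nests cleanly in docs.
FENCE = "`" * 3
BLOCK_RE = re.compile(FENCE + r"(\w*)\n(.*?)" + FENCE, re.DOTALL)

def extract_code_blocks(response: str) -> list[tuple[str, str]]:
    """Return (language, body) pairs for each fenced block in a model reply."""
    return [(lang or "text", body.strip()) for lang, body in BLOCK_RE.findall(response)]

# A typical model reply mixing prose with runnable snippets.
reply = (
    "First install deps:\n"
    + FENCE + "shell\nnpm install\n" + FENCE
    + "\nthen run:\n"
    + FENCE + "python\nprint('hi')\n" + FENCE
)
blocks = extract_code_blocks(reply)
```

The point of the analogy: the real product is everything wrapped around this (indexing, verification, execution), not the extraction itself.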
Perplexity was a wrapper earlier on and arguably it might still be. But they've used AI more effectively than Google to figure out what a person is actually trying to search for, and suggest those things.
I think most of these companies will add minimal value at the start and slowly refine it over time. It's hard to say what percentage.
scarface_74|1 year ago
The Perplexity UI feels like something from the early 2000s
rcarmo|1 year ago
I would go into a demo, look at it, ask them how they did RAG on the data, ask to speak to the people doing their AI models, etc. And then sometimes I would spend a couple of hours wiring up a Node-RED flow to show my colleagues how trivial it was.
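To illustrate how trivial a thin wrapper can be: the core of a naive RAG demo is just "rank some documents, stuff the best ones into a prompt." A pure-Python sketch (keyword overlap standing in for embeddings; all names here are illustrative, not any company's actual stack):

```python
def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query, keep the top k."""
    q = set(query.lower().split())
    scored = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Stuff retrieved context into a prompt string for an LLM call."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "greenhouse humidity spiked overnight",
    "invoice totals for March",
    "tomato crop yield fell after the humidity spike",
]
top = retrieve("why did the crop yield fall after the humidity spike", docs)
```

Swap the overlap score for an embedding similarity and the f-string for an API call, and you have the demo many of these startups were pitching.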
The companies we (now they, since I'm taking some time "off" that moonlighting gig to recover from burnout) ended up prioritizing are the ones that focus on business processes where triage and "human augmentation" can actually benefit from LLM summarization, some automated decision making, and some data "integration" (not just summarization, but broad correlation of events, etc.)
There's a _lot_ to be done in many fields where, say, you will notice a spike in some piece of data (using conventional ML), gather the data around that event and present it to a human (with pros and cons, including trying to flag if the data is reliable). Think GitHub issues for crop management, and you're halfway there.
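A minimal sketch of that pattern, using a trailing-window z-score as the "conventional ML" spike detector (function names and thresholds are illustrative assumptions):

```python
from statistics import mean, stdev

def find_spikes(series: list[float], window: int = 5, z: float = 3.0) -> list[int]:
    """Flag indices whose value sits more than z std devs above the trailing window."""
    spikes = []
    for i in range(window, len(series)):
        prev = series[i - window:i]
        mu, sigma = mean(prev), stdev(prev)
        if sigma > 0 and (series[i] - mu) / sigma > z:
            spikes.append(i)
    return spikes

def context_for(series: list[float], idx: int, pad: int = 2) -> list[float]:
    """Slice the readings around a spike to hand to a human (or an LLM summarizer)."""
    return series[max(0, idx - pad): idx + pad + 1]

# Noisy sensor readings with one obvious spike at index 9.
series = [10, 11, 10, 9, 10, 11, 10, 9, 10, 50, 10, 11]
spikes = find_spikes(series)
```

The LLM's job then starts where this ends: summarizing the gathered context, flagging reliability, and drafting the pros and cons for the human.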
_Those_ companies really need to have their use cases sorted out, and not just try to be "the Uber for greenhouse management" because they wrapped weather forecasts into an LLM.
So, in short, the real added value is expediting or improving use of domain expertise captured from (or still held by) humans.
Gshaheen|1 year ago
I agree here. AI-infused automation/orchestration of sometimes long-running business processes, with a human in the loop, seems to be very high value.
jarsin|1 year ago
The latest ycombinator AI promo video on youtube makes it seem like few are getting in anymore unless they are building around AI. And they strongly push that startups should be hiring developers who use AI tools on top of that.
iiJDSii|1 year ago
But can anybody name a wrapper company as valuable as, or more valuable than, foundation model companies like OpenAI, DeepSeek, Anthropic, Mistral, etc.?
Only one that comes to mind is Perplexity, but they're a bit more than a wrapper startup - I think there's some hardcore engineering to get their web search product working so well.
andrewfromx|1 year ago
It's almost like you need a full-time AI researcher on the team, not to help build, but to keep up on all the new info and constantly tell the team to PIVOT!
iiJDSii|1 year ago
I think it's a non-sequitur - either companies are much more than just UIs around a DB, or those companies just aren't very valuable (in which case, who cares).