Show HN: Metorial (YC F25) – Vercel for MCP
59 points| tobihrbr | 4 months ago |github.com
The Problem: While MCP works great locally (e.g., Cursor or Claude Desktop), server-side deployments are painful. Running MCP servers means managing Docker configs, per-user OAuth flows, scaling concurrent sessions, and building observability from scratch. This infrastructure work turns simple integrations into weeks of setup.
Metorial handles all of this automatically. We maintain an open catalog of ~600 MCP servers (GitHub, Slack, Google Drive, Salesforce, databases, etc.) that you can deploy in three clicks. You can also bring your own MCP server or fork existing ones.
For OAuth, just provide your client ID and secret and we handle the entire flow, including token refresh. Each user then gets an isolated MCP server instance configured with their own OAuth credentials automatically.
What makes us different is that our serverless runtime hibernates idle MCP servers and resumes them with sub-second cold starts while preserving state and connections. Our custom MCP engine can manage thousands of concurrent connections, giving you a scalable service with per-user isolation. Alternatives either run shared servers (a security risk) or provision a separate VM per user (expensive and slow to scale).
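The hibernate/resume pattern described above can be sketched as a small simulation: snapshot session state when the server goes idle, restore it transparently on the next request. This is purely conceptual (all names are invented, not Metorial's actual runtime):

```python
# Conceptual sketch of hibernate/resume with state preservation.
# A simulation, not Metorial's implementation; all names are invented.

class HibernatingServer:
    def __init__(self):
        self.state = {"counter": 0}   # per-session state to preserve
        self.running = True
        self._snapshot = None

    def hibernate(self):
        """Snapshot state and stop the (simulated) server process."""
        self._snapshot = dict(self.state)
        self.running = False

    def resume(self):
        """Restore state when the next request arrives."""
        if not self.running:
            self.state = dict(self._snapshot)
            self.running = True

    def handle(self, request: str) -> str:
        self.resume()                 # transparent to the connected client
        self.state["counter"] += 1
        return f"{request}:{self.state['counter']}"

server = HibernatingServer()
print(server.handle("tools/list"))   # first request
server.hibernate()                   # idle period: process stops
print(server.handle("tools/list"))   # state survives the hibernation
```

The key property is that the client's connection never observes the restart: the counter keeps incrementing across the hibernate/resume cycle.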
Our Python and TypeScript SDKs let you connect LLMs to MCP tools in a single function call, abstracting away the protocol complexity. But if you want to dig deep, you can just use standard MCP and our REST API (https://metorial.com/api) to connect to our platform.
You can self-host (https://github.com/metorial/metorial) or use the managed version at https://metorial.com.
So far, we've seen enterprise teams use Metorial as a central integration hub for tools like Salesforce, while startups use it to cut weeks of infra work when building AI agents with integrations.
Demo video: https://www.youtube.com/watch?v=07StSRNmJZ8
Our Repos: Metorial: https://github.com/metorial/metorial, MCP Containers: https://github.com/metorial/mcp-containers
SDKs: Node/TypeScript: https://github.com/metorial/metorial-node, Python: https://github.com/metorial/metorial-python
We'd love to hear feedback, especially if you've dealt with deploying MCP at scale!
solumos|4 months ago
Congrats on the launch!
rancar2|4 months ago
For those who aren't aware of what FSL (https://fsl.software/) is: "The Functional Source License (FSL) is a Fair Source license that converts to Apache 2.0 or MIT after two years. It is designed for SaaS companies that value both user freedom and developer sustainability. FSL provides everything a developer needs to use and learn from your software without harmful free-riding."
wenyers|4 months ago
1. As you said, Composio doesn’t allow self-hosting and the source code isn’t available. We want to follow PostHog’s playbook in letting devs run everything on their own infrastructure and open sourcing all our MCP containers.
2. A huge benefit of this approach is that we can let you fork any MCP server through our dashboard so that you can manage it yourself and make any adjustments you might need. We’ve heard the importance of doing this repeatedly from our enterprise customers.
3. I do believe that we offer more robustness features, like environment provisioning, deployment versioning, server pooling, in-depth logs of server startup, as well as a complete trace of the entire MCP session.
4. On the integrations side, Composio does indeed have more integrations right now, but we already have around 600 MCP servers (all with multiple tools of course) of which many are being modified by us every day to make them better. Since we support open source contributions, the catalog also grows with the community. (Quick note that you can have private servers scoped to your org).
5. I tried to benchmark our architecture against Composio's in terms of speed. As mentioned in the post above, one thing we spent a lot of time on was optimizing how fast we can run MCP servers serverlessly. However, since Composio is neither source-available nor has any technical documentation on how they handle their servers, I couldn't find any information on their architecture.

One thing they do enforce by default is a meta-tool layer with tools like composio_search_tools and composio_execute_tool. Assuming this is a long-lived process, I still found that our implementation returned a list_tools response faster (including cold start time). If you factor in the time it takes for them to find the right tools, their response took close to double the time. While we might explore a similar meta-tool layer as an optional MCP server in the future, our architecture does seem to be faster on average, though the benchmarking was not entirely rigorous. (I also can't say how they handle multiple users connecting to one MCP server with different OAuth configs, since they don't share that information.)

I plan to publish a more rigorous comparison in a blog post soon, also comparing against hosting on Vercel, Cloudflare, etc.
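The kind of measurement described above can be sketched as a tiny harness: time a list_tools round trip end to end (so any cold start is included) over several trials and take the median. `call_list_tools` is a stand-in; a real benchmark would hit each provider's actual endpoint:

```python
# Minimal latency-benchmark sketch for a tools/list round trip.
# call_list_tools is a placeholder; swap in a real request per provider.
import time
import statistics

def call_list_tools() -> None:
    time.sleep(0.01)  # stand-in for the network round trip + cold start

def bench(n: int = 5) -> float:
    """Median end-to-end latency over n trials, in seconds."""
    samples = []
    for _ in range(n):
        t0 = time.perf_counter()
        call_list_tools()
        samples.append(time.perf_counter() - t0)
    return statistics.median(samples)

print(f"median tools/list latency: {bench():.3f}s")
```

Using the median rather than the mean keeps a single outlier cold start from dominating the comparison.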
Let me know if you have any follow up questions.
If you want to talk more, please feel free to DM me on LinkedIn (https://www.linkedin.com/in/karim-rahme/) or X (https://x.com/wen_rahme).
ushakov|4 months ago
why do I need a specialized platform to deploy MCP instead of just hosting on existing PaaS (Vercel, Railway, Render)?
also if you're not using VMs, how do you isolate per-user servers?
tobihrbr|4 months ago
If you want to run your own remote servers (for your product/company), Railway or Render work great (Vercel is trickier, since Lambdas get very expensive if you run them over long periods of time). Metorial targets developers who build their own AI agents and want to connect them to integrations. Plainly, we do a lot more than run MCP servers: we give you monitoring and observability, handle consumer-facing OAuth, and provide SDKs that make it easy to integrate MCP servers with your agent.
Regarding the second question, Metorial has three execution modes depending on what the server supports:

1. Docker - the most basic mode, which any MCP server should support. We did some heavy optimization to get these containers to start as fast as possible, and our hibernation system supports stopping and resuming them while restoring their state.

2. Remote MCP - we connect to remote MCP servers for you, while still giving you the same features and ease of integration you get with any Metorial server (I could go into more detail on how our remote servers improve on standard ones).

3. Servers on our own Lambda-based runtime. Not every MCP server supports this execution mode, but it's what really sets us apart: the Lambdas only run for short intervals, while the connection is managed by our gateway. We already have about 100 Lambda-based servers and are working on moving more onto that execution model.
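The routing between those three modes might look roughly like this (a simplification; the field names and selection rules are my assumptions, not Metorial's actual logic):

```python
# Rough sketch of picking an execution mode for a deployment.
# Field names and selection order are assumptions for illustration.
from enum import Enum

class ExecMode(Enum):
    LAMBDA = "lambda"    # short-lived runtime; gateway holds the connection
    REMOTE = "remote"    # proxy to an externally hosted MCP server
    DOCKER = "docker"    # containerized baseline with hibernate/resume

def pick_mode(server: dict) -> ExecMode:
    if server.get("supports_lambda"):
        return ExecMode.LAMBDA       # preferred when the server supports it
    if server.get("remote_url"):
        return ExecMode.REMOTE
    return ExecMode.DOCKER           # fallback every MCP server supports

print(pick_mode({"supports_lambda": True}).value)
print(pick_mode({"remote_url": "https://example.com/mcp"}).value)
print(pick_mode({}).value)
```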
There's a lot about our platform that I haven't covered here, like our stateful MCP proxy, our security model, our scalable SOA, and how we turn OAuth into a single REST API call for our users.
Let me know if you have any additional questions, always happy to talk about MCP and software architecture.
TOMDM|4 months ago
Knowing it can integrate with APIs is great, but knowing how a consumer of MCP interacts with auth, and how you do so with downstream APIs, would be very welcome.
tobihrbr|4 months ago
Just for context, it's as simple as 1) creating an OAuth session (https://metorial.com/api/oauth-session), which returns a URL that you 2) pass on to your users to authenticate at, and you're done.
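That two-step flow might look something like this from the caller's side. The request body field and response shape are assumptions based on the description above, not the documented API (check the linked API reference for the real schema), and the HTTP call itself is omitted:

```python
# Hypothetical sketch of the two-step OAuth session flow.
# Field names and response shape are assumptions, not the real API schema.
import json

def create_oauth_session_request(server_deployment_id: str) -> dict:
    """Step 1: build the POST body for creating an OAuth session."""
    return {"serverDeploymentId": server_deployment_id}

def extract_user_url(response_body: str) -> str:
    """Step 2: pull the user-facing authorization URL out of the response,
    then hand it to the end user to complete authentication."""
    return json.loads(response_body)["url"]

# Simulated response; a real integration would POST to the Metorial API.
fake_response = json.dumps({
    "id": "oauth_sess_123",                            # made-up session id
    "url": "https://metorial.com/oauth/authorize/abc",  # made-up auth URL
})
print(extract_user_url(fake_response))
```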
langitbiru|4 months ago
I'm considering adding more chapters to the book: security, easy deployment, etc. So, I may look into your solution. I believe there are other players also, like Klavis AI, FastMCP and some MCP startups that I cannot remember.
Congratz!
electric_muse|4 months ago
Have you written about MCP gateways for helping companies route all MCP traffic through one plane for observability, security, and compliance? Happy to chat through that. I just recorded an end to end demo of what we are working on: https://vimeo.com/1127330739/ee1fe5245b