Show HN: I built an MCP server using Cloudflare's code mode pattern
87 points | jmcodes | 5 months ago | github.com
HN Discussion: https://news.ycombinator.com/item?id=45399204 https://news.ycombinator.com/item?id=45386248
Deno provides a great sandbox environment for TypeScript code execution because of its permissions system, which makes it easy to spin up code that only has access to fetch and network calls.
Stick an MCP proxy on top of that and you've got "CodeMode" (code intermixed with MCP tool calls) for more advanced workflow orchestration.
https://github.com/jx-codes/codemode-mcp
There are a lot of things that can be improved here, like a virtual file system for the agent to actually build up its solution instead of being forced to one-shot it, but the bones are there.
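Roughly, the core loop is: take model-generated code and run it with only the MCP tool wrappers in scope. A minimal sketch of that idea (names are hypothetical; the actual repo enforces the sandbox with Deno's permission flags, not a Function constructor):

```typescript
// Sketch: execute generated code with only the tool wrappers visible.
// Deno permissions do the real sandboxing; this just shows the
// "code intermixed with tool calls" shape.
type Tool = (...args: any[]) => Promise<unknown>;

async function runCodeMode(code: string, tools: Record<string, Tool>): Promise<unknown> {
  const AsyncFunction = Object.getPrototypeOf(async () => {}).constructor;
  // Tool names become parameters, so generated code can call them directly.
  const fn = new AsyncFunction(...Object.keys(tools), code);
  return fn(...Object.values(tools));
}
```

So `runCodeMode("const r = await fetchJson('https://example.com'); return r;", { fetchJson })` would run the snippet with `fetchJson` as its only capability.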
WilcoKruijer|5 months ago
jmcodes|5 months ago
Yeah, since it's using Deno it'd be cool to just use Deno throughout. Definitely gotta clean up the code quite a bit.
cpard|5 months ago
You can see the same behavior if you try to ask an LLM to code in an API that is not commonly used.
When it comes to MCP tooling I followed a different path but with similar assumptions.
There are tools that LLMs have been RLed to death to use, so I'm modeling my tools after them.
Specifically, I try to have a "glob" tool to let the LLM figure out structure, plus search and read tools, and I use regexps as much as possible for passing parameters.
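In sketch form, that surface is just three MCP tool declarations whose names and shapes match what coding agents already know (illustrative schemas, not the actual fenic definitions):

```typescript
// Three tools modeled on the glob/search/read trio agents are trained on.
// Regexp-shaped parameters keep the calling convention familiar.
const tools = [
  {
    name: "glob",
    description: "List items matching a glob pattern, to discover structure",
    inputSchema: { type: "object", properties: { pattern: { type: "string" } }, required: ["pattern"] },
  },
  {
    name: "search",
    description: "Search contents with a regexp and return matching lines",
    inputSchema: { type: "object", properties: { regexp: { type: "string" } }, required: ["regexp"] },
  },
  {
    name: "read",
    description: "Read a specific item found via glob or search",
    inputSchema: { type: "object", properties: { path: { type: "string" } }, required: ["path"] },
  },
];
```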
You can see an early version of this pattern here: https://github.com/typedef-ai/fenic/blob/main/examples/mcp/d...
It has been working well, at least in terms of the model knowing how to invoke and use the tools.
I have to say though that each model is different. I see differences between Claude code and Codex when I use the MCP for development, at least on how good they are in retrieving the information they need.
Maybe I should run some benchmarks and compare more formally.
Palmik|4 months ago
[1] Not all; some use different formats, e.g. GLM uses plain text with special tokens to denote param names and values.
gavmor|5 months ago
Wait, really? This is harder to get right:
```
{
  "jsonrpc": "2.0",
  "id": 102,
  "method": "tools/call",
  "params": {
    "name": "book_flight",
    "arguments": {
      "origin": "SFO",
      "destination": "JFK",
      "departureDate": "2025-10-15",
      "returnDate": "2025-10-18",
      "passengers": 2,
      "cabinClass": "business"
    }
  }
}
```
Than the equivalent... but with `method: "POST"` boilerplate, etc? Or is it literally the chaining of tools that's missing from—and fundamentally faulty in—MCP client implementations?
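i.e., something like this (hypothetical endpoint), where the arguments are identical and only the envelope differs:

```typescript
// Same booking as plain HTTP: method/headers/body boilerplate
// instead of a JSON-RPC frame.
const req = new Request("https://api.example.com/flights/book", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    origin: "SFO",
    destination: "JFK",
    departureDate: "2025-10-15",
    returnDate: "2025-10-18",
    passengers: 2,
    cabinClass: "business",
  }),
});
```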
jmcodes|5 months ago
The fetch code isn't any better than the tool code, I agree, but TypeScript code is more common, so I'd guess this would be too?
But anyway, I think the real power comes from the type-safety part that I left out this morning (working on it now). From what I understand, Cloudflare is essentially generating an SDK for the LLM to write code against.
Instead of writing that fetch call, the LLM would generate something like:
```
const redditResults = await redditMCP_getTopPosts(subreddit);
const insertMutation = await duckdb_Insert("SQL STUFF", redditResults.map(...));
const results = await duckdb_Query(args); // args: duckdb_QueryArgs, generated from the MCP schema
return resultsInSomeNiceFormat;
```
Where the method names come from the MCP server tools, and the argument types are autogenerated from the MCP schemas themselves.
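A rough sketch of that generation step (my guess at the shape, not Cloudflare's actual codegen): walk the tool list and emit one wrapper per tool, each of which just issues a tools/call under the hood.

```typescript
// Turn MCP tool names into callable functions; the real thing would also
// emit argument types from each tool's JSON schema.
type CallTool = (name: string, args: Record<string, unknown>) => Promise<unknown>;

function makeSdk(toolNames: string[], callTool: CallTool) {
  const sdk: Record<string, (args: Record<string, unknown>) => Promise<unknown>> = {};
  for (const name of toolNames) {
    // e.g. "redditMCP/getTopPosts" becomes sdk.redditMCP_getTopPosts(args)
    sdk[name.replace(/[^A-Za-z0-9]/g, "_")] = (args) => callTool(name, args);
  }
  return sdk;
}
```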
No idea if this is a valuable workflow or not personally. I just thought it was cool and wanted to tinker with it.
jmcodes|5 months ago
For those of you interested, I wrote up and built a more RPC-style, TypeScript-centric approach to avoid using other MCP servers at all. Would appreciate some thoughts!
https://news.ycombinator.com/item?id=45420133
fabmilo|5 months ago
jmcodes|5 months ago
If you run `deno check` before executing the code, you get the type-safety loop (working on this now).
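The check-before-run loop itself is simple; a hedged sketch (synchronous stand-ins for the LLM call and for `deno check`):

```typescript
// Check generated code before running it, feeding errors back for a retry.
// A real version would shell out to `deno check` and call an LLM.
type Checker = (code: string) => { ok: boolean; errors: string[] };

function generateUntilTyped(
  generate: (feedback: string[]) => string, // stands in for the LLM call
  check: Checker,                           // stands in for `deno check`
  maxTries = 3,
): string {
  let feedback: string[] = [];
  for (let i = 0; i < maxTries; i++) {
    const code = generate(feedback);
    const result = check(code);
    if (result.ok) return code; // type-checks: safe to execute
    feedback = result.errors;   // otherwise hand the errors back
  }
  throw new Error("generated code never type-checked");
}
```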
Later I want to see what'd happen if you give the LLM a repo of sorts to store useful snippets and functions with comments for later use. So the LLM itself would save workflows, be able to import them into the Deno environment and chain those together.
It definitely needs a prompt that tells it to use the MCP server but I can see it being pretty powerful.
I only did simple tests like get Reddit posts, their comments, find the weather on those days, stick them in duckdb, and run some social media metric queries.
I could see that same test being: "find me leads, filter by keywords, run against some parquet file stored somewhere using duckdb, craft an email for my boss."
I'm kind of ranting, but I think this is a pretty exciting approach.
Edit: a GraphQL-style codegen layer, but for all your APIs, seems like a pretty obvious middle layer for this; maybe next weekend.
nivertech|5 months ago
Just because an agent "lives" in the environment doesn't make it RL. It needs a reward function, or better yet, something like Gym.
manojlds|5 months ago
jmcodes|5 months ago
One thing I ran into is that since the RPC calls are independent Deno processes, you can't keep, say, a DuckDB or SQLite connection open.
But since it's just TypeScript on Deno, I can use a regular server process instead of MCP, expose it through the TS RPC files I define, and the LLM will have access to it.
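A minimal sketch of that split (a Map standing in for the open DuckDB handle; a real version would expose this over HTTP or a socket):

```typescript
// The long-lived server owns the stateful handle; short-lived Deno
// processes call into it via RPC instead of reopening the DB each time.
class StatefulServer {
  private db = new Map<string, unknown>(); // stands in for an open DuckDB connection

  async call(method: string, args: unknown[]): Promise<unknown> {
    switch (method) {
      case "set": this.db.set(args[0] as string, args[1]); return true;
      case "get": return this.db.get(args[0] as string);
      default: throw new Error(`unknown method: ${method}`);
    }
  }
}
```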
https://github.com/jx-codes/mcp-rpc https://news.ycombinator.com/item?id=45420133
evertedsphere|5 months ago
https://lucumr.pocoo.org/2025/8/18/code-mcps/
kordlessagain|5 months ago
1. Tool calls are intentionally simple; adding a code generation layer introduces needless complexity and failure points.
2. Cloudflare already acts as a man-in-the-middle for ~20% of the Internet, with limited transparency about government data requests.
3. This is clearly designed to drive adoption of their Worker platform and create lock-in for AI agent developers
Similar to their x402 payment scheme that reinvents HTTP 402 under their control, the community has already built alternatives (see the Aperture implementation from Lightning) that don't require surrendering more of your stack to Cloudflare.
Remember what's happening here: a company with unprecedented visibility into web traffic wants even more control over how AI agents interact with the internet. Even if you don't believe that AI will eventually govern itself, this is a horrible idea that limits individuals' ability to automate portions of their access to the web.
No thanks.
dennisy|5 months ago
https://huggingface.co/blog/smolagents
danielser|5 months ago
In which case you're likely wrong; people do want it, and AI will be very good at orchestrating simple patterns.
CF definitely has a vested interest. The problem for them now, as I see it, is that THEY DIDN'T ACTUALLY LAUNCH IT... but they did describe what it is/does in complete detail.
Now there are gonna be dozens of non-CF-locked clones, just like the one OP linked.
luckydata|5 months ago
danielser|5 months ago
So once it has the API shape in memory, it could make dozens of tool calls in a single call.
It isn't about token saving; it's about time, efficiency of tool usage, and response-time accumulation.
Instead of 20 separate tool calls one after the other, you get one larger, orchestrated one that only returns exactly what it needed.
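As a toy illustration of the difference (stub functions standing in for real tools):

```typescript
// 20 separate tool calls: each one is a full model <-> tool round trip.
async function sequential(fetchUser: (id: number) => Promise<string>): Promise<string[]> {
  const out: string[] = [];
  for (let id = 0; id < 20; id++) out.push(await fetchUser(id));
  return out;
}

// One orchestrated call: the loop runs inside the code-mode sandbox and
// only the final result crosses back to the model.
async function orchestrated(runCode: (code: string) => Promise<string[]>): Promise<string[]> {
  return runCode("return await Promise.all(ids.map(fetchUser))");
}
```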
lucis|5 months ago