item 43307225

ondrsh | 11 months ago

Exactly. An AI-web based on the principles of HATEOAS is the next step, where instead of links, we would have function calls.

As you said, HATEOAS requires a generic client that can understand anything at runtime — a client with general intelligence. Until recently, humans were the only ones fulfilling that requirement. And because we suck at reading JSON, HATEOAS had to use HTML. Now that we have strong AI, we can drop the Hypermedia from 'H'ATEOAS and use JSON instead.
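To make the idea concrete, here is a minimal sketch of what such a JSON response might look like. Everything here is hypothetical (the field names, the `cancel_order` action, the shape of the schema are illustrative assumptions, not a real protocol): instead of hypermedia links, the server advertises function calls that an AI client can discover and invoke at runtime.

```python
import json

# Hypothetical "AI-web" response: the server transfers state plus the
# actions (function calls) available from that state, in place of the
# <a> links a classic HATEOAS server would embed in HTML.
raw = """
{
  "state": {"order_id": 42, "status": "pending"},
  "actions": [
    {"name": "cancel_order",
     "description": "Cancel this pending order",
     "parameters": {"reason": {"type": "string"}}},
    {"name": "get_shipping_options",
     "description": "List shipping options for this order",
     "parameters": {}}
  ]
}
"""
response = json.loads(raw)

# The client needs no hard-coded API knowledge: it reads the available
# actions from the response itself, the way a human reads links on a page.
for action in response["actions"]:
    print(action["name"], "-", action["description"])
```

The point is that the server, not the client, carries the application state machine; a generally intelligent client only needs to parse and choose among the advertised calls.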

I wrote about that exact thing in Part 2: https://www.ondr.sh/blog/ai-web


thierrydamiba | 11 months ago

Both blog posts were excellent. Thanks for the breakdown.

I'm bullish on MCP. What are some non-obvious things I should consider that might dampen my fire?

ondrsh | 11 months ago

TL;DR: IMHO, the MCP enforces too much structure, which makes it vulnerable to disruption by less structured protocols that can evolve according to user needs.

The key reason the web won out over Gopher and similar protocols was that the early web was stupidly simple. It had virtually no structure. In fact, the web might have been the greatest MVP of all time: it handed server developers a blank canvas with as few rules as possible, leading to huge variance in outputs. Early websites differed far more from each other than, for example, Gopher sites, which had strict rules on how they had to work and look.

Yet in a server-client "ping-pong" system, higher variance almost always wins. Why? Because clients consume more of what they like and less of what they don't. This creates an evolutionary selection process: bad ideas die off, and good ideas propagate. Developers end up building what people want, not by deliberate choice; the evolutionary process merely makes it look deliberate.

The key insight is that the effectiveness of this process stems from a lack of structure. A lack of structure leads to high variance, which lets the protocol escape local minima and evolve according to user needs.

The bear case for MCP is that it's going the exact opposite route. It comes with tons of features, each adding layers of abstraction and structure. While that might work in well-understood fields, it's much harder to pull off in novel domains where user preferences aren't clear yet — knowing what users want is hard. The MCP's rigid structure inherently limits variance in server styles (a trend already observable IMHO), making MCP vulnerable to competition from newer, less structured protocols — similar to how the web steamrolled Gopher, even though the latter initially seemed too far ahead to catch. The fact that almost all MCP servers are self-contained (they don't link to other MCP servers) further weakens the value of the current lead, since the lock-in effect is smaller.