top | item 43486829

nlarew | 11 months ago

> What is there to standardize?

At a high level, the request format and endpoints. Instead of writing a bespoke connector for every context source, each with its own preferred API conventions, I just tell my client that the server exists and the standard takes care of the rest.

Do you have similar doubts about something like gRPC?

> This looks like micro services crossed with AI.

Seems like a cynical take with no substance to me. What about a standard request protocol implies anything about separation of concerns, scaling, etc.?

bob1029 | 11 months ago

> At a high level, the request format and endpoints.

I think we fundamentally disagree on what "request format" means in the context of a large language model.

Spivak | 11 months ago

Because it's not a request format for LLMs; it's a request format for client software that is instrumenting LLMs. If you make a connector to, say, HomeAssistant to turn your lights on and off, you're exposing a tool definition, which is really just a JSON schema. The agent presents that tool to the LLM as one it's allowed to use, validates that the LLM's call matches your change_light_state tool schema, and sends the appropriate API call off to your server.
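A minimal sketch in Python of what that flow could look like. The `change_light_state` tool name comes from the comment above; the exact schema fields and the hand-rolled validator are my assumptions (a real agent would use a full JSON Schema validator):

```python
# Sketch of an MCP-style tool definition and the agent-side validation step.
# Field names here are illustrative assumptions, not the literal MCP spec.

# The connector exposes a tool definition: a name, a description, and a
# JSON schema describing the arguments the LLM is allowed to supply.
CHANGE_LIGHT_STATE = {
    "name": "change_light_state",
    "description": "Turn a light on or off.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "entity_id": {"type": "string"},
            "state": {"type": "string", "enum": ["on", "off"]},
        },
        "required": ["entity_id", "state"],
    },
}

def validate_tool_call(tool: dict, arguments: dict) -> bool:
    """Minimal hand-rolled check that the LLM's arguments match the schema."""
    schema = tool["inputSchema"]
    # Every required field must be present.
    for field in schema["required"]:
        if field not in arguments:
            return False
    # Each supplied field must match its declared type and enum, if any.
    for field, rules in schema["properties"].items():
        if field in arguments:
            if rules["type"] == "string" and not isinstance(arguments[field], str):
                return False
            if "enum" in rules and arguments[field] not in rules["enum"]:
                return False
    return True

# The agent validates the LLM's proposed call before forwarding it.
ok = validate_tool_call(CHANGE_LIGHT_STATE, {"entity_id": "light.kitchen", "state": "on"})
bad = validate_tool_call(CHANGE_LIGHT_STATE, {"entity_id": "light.kitchen", "state": "dim"})
print(ok, bad)  # prints: True False
```

Only after this check passes does the agent translate the call into the actual HomeAssistant API request, so the LLM never talks to your server directly.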

The spec is genuinely a hot fucking mess that looks like a hobby project by an overeager junior dev, but conceptually it's just a set of JSON schemas to represent common LLM things (prompts, tools, files) and some verbs.

The useful content of the spec is literally just https://github.com/modelcontextprotocol/specification/blob/m... and even then it's a bit much.
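The "schemas plus verbs" shape described above can be sketched as JSON-RPC-style messages. The method names and field layout here follow my reading of the spec and are assumptions, not a verbatim reproduction:

```python
# Illustrative sketch of MCP-style "verbs" as JSON-RPC messages.
# Method and field names are assumptions based on the spec's general shape.
import json

# Verb 1: the client asks the server which tools it exposes.
list_request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}

# Verb 2: the client asks the server to invoke one of those tools,
# passing arguments that must match the tool's published JSON schema.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "change_light_state",
        "arguments": {"entity_id": "light.kitchen", "state": "off"},
    },
}

# On the wire, each message is just serialized JSON.
wire = json.dumps(call_request)
print(json.loads(wire)["method"])  # prints: tools/call
```

That's the whole conceptual surface: a handful of verbs like these, plus the schemas that constrain what goes in `params`.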