
Building MCP servers for ChatGPT and API integrations

70 points | kevinslin | 7 months ago | platform.openai.com

30 comments


jngiam1|7 months ago

I find the state of MCPs with OpenAI quite confusing:

- As an end-user, you can connect MCP servers only if they expose `search` and `fetch`, and those only work in deep research mode.

- As a developer, you can use MCP with the API, which supports the full set of MCP tools - all of a server's tools become available; this shows up in the dev playground.

- Custom GPTs support any action, but not MCP. So if you had a layer to translate MCP to their OpenAPI spec, it would work. But Custom GPTs with actions only support models 4o and 4.1, so you don't get the benefit of the o-series models.

Figuring out what works when is harder than it needs to be.
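For the developer path described above, the API accepts a remote MCP server as a tool entry in the request. A minimal sketch of what that request body looks like, assuming the field names from OpenAI's remote-MCP guide (the server label, URL, and input text here are placeholders, not a real server):

```python
import json

# Sketch of a Responses API request body that attaches a remote MCP server.
# "example_server" and the URL are illustrative placeholders; field names
# (type, server_label, server_url, require_approval) follow OpenAI's
# remote MCP guide linked elsewhere in this thread.
request_body = {
    "model": "gpt-4.1",
    "tools": [
        {
            "type": "mcp",
            "server_label": "example_server",         # placeholder label
            "server_url": "https://example.com/mcp",  # placeholder URL
            "require_approval": "never",
        }
    ],
    "input": "Summarize the latest docs page.",
}

print(json.dumps(request_body, indent=2))
```

With a body like this, all of the server's tools are exposed to the model in one shot, which is why the API path feels so much less restricted than the two-tool connector path in ChatGPT itself.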

randomjoe2|7 months ago

OpenAI has fallen behind, way behind. Stop trying to benchmark off OpenAI; it's pointless imo. They're not even playing the same game as everyone else.

lherron|7 months ago

The lack of MCP support in their desktop client is especially disappointing considering everything Anthropic has shipped on Claude Code the last few weeks.

ChatGPT desktop client with only search/fetch MCPs is far, far inferior to CC from a utility/value perspective.

jimmydoe|7 months ago

It feels like they don’t have a good product strategy that matches their engineering throughput.

ascorbic|7 months ago

Has something changed? That seems to be the same page they've had for a couple of months. It's only for deep research mode, and is restricted to Pro and Enterprise.

jimmydoe|7 months ago

Same. I thought it was now available for Plus, but apparently that's not the case.

It's disappointing that they are gating this and the browser agent behind the $100 tier, and even at $100, it's only two tool methods.

cube2222|7 months ago

The support here is really weird.

If I understand correctly, it requires your MCP server to have exactly two tools - search and fetch.

So this is not really support for MCP in general, as in all the available MCP servers. It’s support for their own custom higher-level protocol built on top of MCP.

asabla|7 months ago

For ChatGPT and deep research, yes; not when using the API. I guess you could just return empty results if you want to offer other tools as well (can't test it now, since custom connectors only support Workspace or Pro accounts at the moment).

The quote we're talking about: > To work with ChatGPT Connectors or deep research (in ChatGPT or via API), your MCP server must implement two tools - search and fetch.

Reference links:

- Using remote MCP servers with the API: https://platform.openai.com/docs/guides/tools-remote-mcp

- Which account types can set up custom connectors in ChatGPT: https://help.openai.com/en/articles/11487775-connectors-in-c...
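Per the quoted requirement, a connector-compatible server exposes exactly `search` and `fetch`. A minimal sketch of the two handlers and result shapes, assuming an in-memory document store; the `DOCS` data and any fields beyond id/title/url/text are illustrative, not the official schema (check OpenAI's connector docs for the authoritative one):

```python
# Sketch of the two tools a ChatGPT connector / deep research MCP server
# must implement. The DOCS store is a stand-in for a real search index;
# the exact result fields should be checked against OpenAI's docs.
DOCS = {
    "doc-1": {
        "title": "MCP overview",
        "url": "https://example.com/doc-1",
        "text": "Model Context Protocol lets clients call server tools.",
    },
}

def search(query: str) -> dict:
    """Return candidate documents matching the query (id/title/url only)."""
    results = [
        {"id": doc_id, "title": doc["title"], "url": doc["url"]}
        for doc_id, doc in DOCS.items()
        if query.lower() in doc["text"].lower()
    ]
    return {"results": results}

def fetch(id: str) -> dict:
    """Return the full content of one document previously surfaced by search."""
    doc = DOCS[id]
    return {"id": id, "title": doc["title"], "text": doc["text"], "url": doc["url"]}

print(search("protocol"))
```

The split mirrors how deep research works: it calls `search` to get candidate ids, then `fetch` on the ids it wants to read in full, which is why arbitrary extra tools on the server go unused in this mode.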

monadoid|7 months ago

Yeah I hope they open up support to all MCP tools - this is lame as-is.

varunneal|7 months ago

This guide is just an example of how to build a single MCP server (e.g. a vector store). ChatGPT connectors support MCPs in general now.

maxwellg|7 months ago

It is very nice to get MCP support in ChatGPT. OpenAI really fumbled the bag with the OpenAPI-based Custom Actions (or was it Custom GPTs?). The web editor experience was always incredibly buggy, even months after initial release. MCP servers allow us to move nearly all of the tool definition bits into the server codebase itself, so we can change things on the fly / version control / feature flag tools etc. much better.

babyshake|7 months ago

Is it safe to say that MCP has "won" vs. A2A? Or is this a misreading of the situation?

d_watt|7 months ago

They're not directly solving the same problem. MCP is for exposing tools, such as reading files. A2A is for agents talking to other agents to collaborate.

MCP servers can expose tools that are agents, but don't have to, and usually don't.

That being said, I can't say I've come across an actual implementation of a2a outside of press releases...

miguelxpn|7 months ago

I think they have different use cases. MCP is for tool calling, A2A for agents communicating between themselves.

kinduff|7 months ago

I like how they plugged the MCP support into their existing tools. Pretty smart!

DiabloD3|7 months ago

I love how they don't actually explain why I (or anyone else) would ever implement their API.

Given how disastrous the AI 'industry' has been (misappropriating customer data, performing actions on behalf of customers that lead to data and/or financial loss, then seeking protection from the law in one or more of these cases), isn't providing an MCP service essentially requiring you to notify customers of a GDPR-or-similar data-compromise event at some point in the future, when it suddenly but inevitably betrays you?

Like, isn't OpenAI just leading people to a footgun and then kindly asking them to use it, for the betterment of OpenAI's bottom line, which was significantly in the red for FY24?

randomjoe2|7 months ago

You think they need to convince you on the concept of AIs in this article?

abletonlive|7 months ago

Ah yes, "disastrous" goes unquoted, but you felt the need to put "industry" in quotation marks, as if it's not the biggest shift that has happened in your entire lifetime. That is intellectually dishonest and incongruent with the reality you live in.

ipsum2|7 months ago

Title is mildly misleading; full MCP support is only available via the API, not the web/mobile interface.