Which is strictly worse than just giving the LLM access to the source of truth for the database.
You're adding a round trip to the database and to the LLM, and inserting a tool call into the conversation before it even starts generating any code.
And the reference Postgres MCP implementation doesn't include Postgres types or materialized views, yet it is one of the most widely used packages: Zed.dev's MCP server, for example, is seemingly just a port of it and has the same problem.
I don't see how a round trip of <500ms, which is equivalent to maybe 50 tokens, is worse than including many thousands of extra tokens in the prompt just in case they might be useful. Not to mention the context fatigue.
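The 50-token figure is easy to sanity-check with back-of-envelope arithmetic, assuming the ~100 tokens/s decode throughput the equivalence implies (adjust for your model):

```python
# How many output tokens' worth of time is a given round trip?
# Assumes ~100 tokens/s generation speed; this is an illustrative
# figure, not a property of any particular model.

def latency_token_equivalent(latency_ms: float, tokens_per_sec: float = 100.0) -> float:
    """Tokens the model could have generated while waiting on the round trip."""
    return latency_ms / 1000.0 * tokens_per_sec

# A 500 ms schema-fetch round trip "costs" about 50 tokens of generation
# time, versus inlining a multi-thousand-token schema dump into every prompt.
print(latency_token_equivalent(500))  # 50.0
```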
If designed well - by suspending generation in memory and inserting a <function_result> block, without restarting generation and fetching the cache back from disk - the round trip/tool call is the better option: it costs the equivalent of ~50 tokens of waiting plus the <function_result> tokens themselves.
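A toy sketch of the driver-side shape of that pattern - the step format and `run_tool` are invented for illustration, not any real model API: generation pauses at a tool call, the result is spliced into the transcript as a `<function_result>` block, and emission continues from where it left off rather than restarting.

```python
from typing import Callable, Iterable, Tuple

def generate_with_tools(
    steps: Iterable[Tuple[str, str]],
    run_tool: Callable[[str], str],
) -> str:
    """Toy driver loop: 'steps' stands in for what the model emits.
    On a ("tool", name) step we run the tool, splice the result into
    the transcript as a <function_result> block, and keep going -
    no restart, no re-prefill of the earlier context."""
    transcript = []
    for kind, payload in steps:
        if kind == "text":
            transcript.append(payload)
        else:  # kind == "tool": suspend, fetch, splice, resume
            result = run_tool(payload)
            transcript.append(f"<function_result>{result}</function_result>")
    return "".join(transcript)

out = generate_with_tools(
    [("text", "-- looking up schema "), ("tool", "get_schema"), ("text", "SELECT id FROM users;")],
    run_tool=lambda name: "users(id int, name text)",
)
print(out)
```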
PM on the project here - the results from the query are generally not used by the LLM. In agent mode, though, during query planning the agent may retrieve a sample of the data to improve the precision of its queries: for example, getting the distinct values of a dimension-table column to resolve a filter condition in a natural-language statement.
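A toy version of that planning step, with SQLite standing in for the real database and an invented `dim_region` table: the agent samples the distinct values of a dimension column so a natural-language filter ("west coast customers", say) can be mapped onto the values actually stored.

```python
import sqlite3

# Invented example data; in practice the agent would run the DISTINCT
# query against the user's own dimension table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_region (region_id INTEGER, region_name TEXT)")
conn.executemany(
    "INSERT INTO dim_region VALUES (?, ?)",
    [(1, "US-West"), (2, "US-East"), (3, "US-West"), (4, "EMEA")],
)

# The sampled values give the model exact strings to use in its WHERE clause.
distinct_values = [row[0] for row in conn.execute(
    "SELECT DISTINCT region_name FROM dim_region ORDER BY region_name"
)]
print(distinct_values)  # ['EMEA', 'US-East', 'US-West']
```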
BoorishBears|9 months ago
fwip|9 months ago
tempaccount420|9 months ago
nsonha|9 months ago
wredcoll|9 months ago
unknown|9 months ago
[deleted]
layoric|9 months ago
maxluk|9 months ago