item 44011569

Postman for MCP

31 points | andes314 | 9 months ago | usetexture.com

10 comments


notpushkin | 9 months ago

Please note that posting the same thing multiple times in succession is frowned upon here (but congrats getting to the first page!)

I’m also confused by the title – why “Postman”? I do know about Postman the HTTP client, but I don’t get the parallel here.

andes314 | 9 months ago

This platform is both an MCP client and an MCP server creator tool: you build MCP servers from APIs. It is a "Postman" tool in both of those senses.

TZubiri | 9 months ago

"error, please clear and try again: Error code: 429 - {'type': 'error', 'error': {'type': 'rate_limit_error', 'message': 'This request would exceed the rate limit for your organization (67777b05-661c-4183-aa19-ec6e299f95ac) of 50,000 input tokens per minute. }}

This is a very bad idea buddy. Maybe try letting users set their API tokens.
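Both fixes suggested here can be combined: retry 429s with exponential backoff, and bill each request against the caller's own key instead of one shared organization quota. A minimal stdlib-only sketch; `RateLimitError`, `flaky_provider`, and the backoff parameters are hypothetical stand-ins, not the site's or Anthropic's actual code:

```python
import random
import time

class RateLimitError(Exception):
    """Stand-in for a provider's HTTP 429 rate_limit_error response."""

def with_backoff(fn, max_retries=5, base_delay=0.01):
    """Retry fn on rate-limit errors with exponential backoff plus jitter."""
    for attempt in range(max_retries):
        try:
            return fn()
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error to the caller
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, base_delay))

# Simulated provider: fails twice with a 429-style error, then succeeds.
calls = {"n": 0}
def flaky_provider(prompt):
    calls["n"] += 1
    if calls["n"] < 3:
        raise RateLimitError("rate_limit_error")
    return f"answer to: {prompt}"

print(with_backoff(lambda: flaky_provider("hello")))  # survives two 429s
```

With per-user API keys, `fn` would close over the caller's own key, so one heavy user exhausts only their own quota rather than the shared 50,000-tokens-per-minute limit quoted above.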

phildenhoff | 9 months ago

andes314, can you expand on how you see this as Postman for MCP?

andes314 | 9 months ago

It is a platform to create MCP servers from API endpoints, and then chat with them without having to use Claude’s clunky integration process. It is simple and complete.
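The core move described here, turning a REST endpoint into something an MCP client can call, amounts to emitting a tool definition. A minimal sketch, assuming nothing about the site's internals; the field names follow the shape of an MCP `tools/list` result, but `tool_from_endpoint` and the example endpoint are hypothetical:

```python
import json

def tool_from_endpoint(name, method, url, params):
    """Build an MCP-style tool definition (name, description, JSON Schema
    input spec) from a plain REST endpoint."""
    return {
        "name": name,
        "description": f"{method} {url}",
        "inputSchema": {
            "type": "object",
            "properties": {p: {"type": "string"} for p in params},
            "required": list(params),
        },
    }

tool = tool_from_endpoint(
    "get_user", "GET", "https://api.example.com/users/{id}", ["id"]
)
print(json.dumps(tool, indent=2))
```

An MCP server built this way would list such definitions to the client and, on a `tools/call`, translate the arguments into the underlying HTTP request.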

andes314 | 9 months ago

If it gets the hug of death, please try the Claude client to test it! Anthropic only allows me 30 req/s at the moment.

fcarraldo | 9 months ago

How much are you spending on tokens right now?

DSchau | 9 months ago

[deleted]

cr125rider | 9 months ago

Does anyone here still use Postman? It was very bloated and not great at some basic things I wanted to do quickly. It felt pretty closed off and proprietary when I just wanted to save some of my queries in git for my QA team.

andes314 | 9 months ago

My platform goes beyond being an automatic wrapper of an API and lets you specify to the model, in natural language, how it should parse inputs and outputs. I find LLMs are very responsive to this type of specification, and to the best of my knowledge no one is trying this yet.

You also don’t seem to offer a simple chat-based client.
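One simple way to realize the natural-language parsing spec described above is to fold it into the instructions the model sees for each tool. A minimal sketch under that assumption; the `parse_spec` field, `build_system_prompt` helper, and weather endpoint are all hypothetical illustrations, not the platform's actual design:

```python
# Hypothetical tool spec: an API wrapper plus a natural-language
# instruction telling the model how to parse the endpoint's output.
tool = {
    "name": "get_weather",
    "endpoint": "https://api.example.com/weather",
    "parse_spec": "Return only the temperature in Celsius, rounded to one decimal.",
}

def build_system_prompt(tool):
    """Fold the natural-language parsing spec into the model's instructions,
    so the LLM itself handles input/output shaping instead of handwritten glue."""
    return (
        f"You can call {tool['name']} at {tool['endpoint']}. "
        f"When handling its output: {tool['parse_spec']}"
    )

print(build_system_prompt(tool))
```

The appeal of this design is that the parsing logic lives in prose the model follows, so changing how results are shaped means editing a sentence rather than writing glue code.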