The Open Responses Server wraps OpenAI-compatible API interfaces in the Responses API, with MCP support enabling Codex compatibility. It adds stateful features on top of backends such as Ollama and vLLM. By exposing OpenAI's Responses API, it lets AI agents interact with a variety of AI backends, so developers can integrate different models into their applications through a single interface.
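Agents talk to the server through the Responses API's `/v1/responses` endpoint. As a minimal sketch of what such a request body looks like, assuming the server is running locally (the URL, port, and model name below are placeholders, not taken from this page):

```python
import json

# Hypothetical endpoint; adjust host/port to wherever the server actually runs.
url = "http://localhost:8080/v1/responses"

# Minimal Responses API request body: a backend model name and an input prompt.
payload = {
    "model": "llama3",  # placeholder; use whatever model your backend serves
    "input": "List the MCP tools you can call.",
    "stream": False,
}

# POSTing this JSON to `url` would return a Responses API response object.
print(json.dumps(payload, indent=2))
```

Any OpenAI-compatible HTTP client can send this payload; only the base URL needs to point at the local server instead of api.openai.com.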
Install the server:

```shell
pip install open-responses-server
```

Add this configuration to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "teabranch-open-responses-server-github": {
      "command": "uvx",
      "args": [
        "open-responses-server"
      ]
    }
  }
}
```

Restart Claude Desktop, then ask:
"What tools do you have available from open responses server?"
API Key Required
This server requires an API key from open responses server. Add it to your environment or config.
| Variable | Required | Description |
|---|---|---|
| OPEN_RESPONSES_SERVER_API_KEY | Yes | Your open responses server API key |
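One way to provide the key is to export it in the shell that launches Claude Desktop or the server; the key value below is a placeholder:

```shell
# Make the API key available to child processes (placeholder value).
export OPEN_RESPONSES_SERVER_API_KEY="sk-example-key"

# Confirm the variable is set without printing the secret itself.
echo "${OPEN_RESPONSES_SERVER_API_KEY:+key is set}"
```

Alternatively, most MCP clients let you attach an `env` block to the server entry in `claude_desktop_config.json` so the key travels with the config.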
"What resources are available in open responses server?"
Claude will query available resources and return a list of what you can access.
"Show me details about [specific item] in open responses server"
Claude will fetch and display detailed information about the requested item.
"Create a new [item] in open responses server with [details]"
Claude will use the appropriate tool to create the resource and confirm success.