The instrumentation-mcp server provides observability for LLM applications. It exposes metrics, traces, and logs to monitor AI agent performance. It integrates with OpenTelemetry for telemetry data collection and supports various backends like Jaeger, Prometheus, and Grafana. Developers use it to debug, optimize, and maintain LLM applications by tracking request latency, error rates, and usage patterns.
Run the server with:

```bash
npx -y @traceloop/instrumentation-mcp
```

Add this configuration to your `claude_desktop_config.json`:
```json
{
  "mcpServers": {
    "traceloop-instrumentation-mcp-npm": {
      "command": "npx",
      "args": [
        "-y",
        "@traceloop/instrumentation-mcp"
      ]
    }
  }
}
```

Restart Claude Desktop, then ask:
"What tools do you have available from instrumentation-mcp?"
API Key Required
This server requires an API key from instrumentation-mcp. Add it to your environment or config.
| Variable | Required | Description |
|---|---|---|
| INSTRUMENTATION_MCP_API_KEY | Yes | Your instrumentation-mcp API key |
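One way to supply the key is through the `env` block of the same server entry in `claude_desktop_config.json`. Note that environment variable names use underscores rather than hyphens, since hyphens are not valid in shell variable names; the value below is a placeholder, not a real key:

```json
{
  "mcpServers": {
    "traceloop-instrumentation-mcp-npm": {
      "command": "npx",
      "args": ["-y", "@traceloop/instrumentation-mcp"],
      "env": {
        "INSTRUMENTATION_MCP_API_KEY": "your-api-key"
      }
    }
  }
}
```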
"What resources are available in instrumentation-mcp?"
Claude will query available resources and return a list of what you can access.
"Show me details about [specific item] in instrumentation-mcp"
Claude will fetch and display detailed information about the requested item.
"Create a new [item] in instrumentation-mcp with [details]"
Claude will use the appropriate tool to create the resource and confirm success.
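Behind prompts like these, Claude communicates with the server over the Model Context Protocol, which is built on JSON-RPC 2.0. A tool-discovery request, for example, looks roughly like this (method name from the MCP specification):

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }
```

The server responds with a `result.tools` array describing each tool's name, description, and input schema, which is how Claude knows what it can call on your behalf.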