Optillm is an inference proxy that improves LLM accuracy on reasoning tasks. It connects to LLMs via API, providing optimized inference without retraining. Developers use it to enhance model performance for tasks requiring logical reasoning.
Install the package:

```bash
pip install optillm
```

Add this configuration to your claude_desktop_config.json:

```json
{
  "mcpServers": {
    "algorithmicsuperintelligence-optillm-github": {
      "command": "uvx",
      "args": [
        "optillm"
      ]
    }
  }
}
```

Restart Claude Desktop, then ask:
"What tools do you have available from optillm?"
API Key Required
This server requires an API key from optillm. Add it to your environment or config.
| Variable | Required | Description |
|---|---|---|
| OPTILLM_API_KEY | Yes | Your optillm API key |
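One common way to supply the key, assuming your MCP client supports a per-server `env` block (as Claude Desktop does), is to add it directly to the same config entry. The key value below is a placeholder:

```json
{
  "mcpServers": {
    "algorithmicsuperintelligence-optillm-github": {
      "command": "uvx",
      "args": [
        "optillm"
      ],
      "env": {
        "OPTILLM_API_KEY": "your-key-here"
      }
    }
  }
}
```

Alternatively, export `OPTILLM_API_KEY` in the shell environment from which the client is launched.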
"What resources are available in optillm?"
Claude will query available resources and return a list of what you can access.
"Show me details about [specific item] in optillm"
Claude will fetch and display detailed information about the requested item.
"Create a new [item] in optillm with [details]"
Claude will use the appropriate tool to create the resource and confirm success.
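Beyond the Claude Desktop integration, the optillm proxy itself exposes an OpenAI-compatible API (by default on port 8000), and you pick a reasoning technique by prefixing the model name, e.g. `moa-` for mixture of agents. A minimal sketch of building such a request; the endpoint URL and model name are illustrative assumptions:

```python
import json

# optillm routes requests through a reasoning approach chosen by a
# model-name prefix (e.g. "moa-gpt-4o-mini"). The URL below assumes a
# locally running proxy on the default port; adjust for your setup.
OPTILLM_URL = "http://localhost:8000/v1/chat/completions"

def build_request(approach: str, model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat payload routed through an optillm approach."""
    return {
        "model": f"{approach}-{model}",  # approach prefix selects the technique
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_request("moa", "gpt-4o-mini", "What is 2 + 2?")
body = json.dumps(payload)  # JSON body ready to POST to OPTILLM_URL
print(payload["model"])     # moa-gpt-4o-mini
```

The same pattern applies to other approaches (for example a best-of-n style prefix): only the prefix changes, so existing OpenAI client code needs no structural modification.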