Ollama MCP provides local LLM access for AI agents. It exposes Ollama's API to run models locally, manage model libraries, and handle inference requests. Developers use it to integrate local LLMs into AI assistants, avoiding cloud dependencies. It connects directly to Ollama's local server.
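Under the hood, the server brokers requests to Ollama's REST API on the local machine. As a rough sketch of the kind of call it issues, here is a direct request to Ollama's documented `/api/generate` endpoint (the model name `llama3` is an assumption; use whichever model you have pulled):

```typescript
// Minimal sketch of the inference call the MCP server brokers.
// Assumes Ollama is running at its default address and "llama3" is pulled.
const res = await fetch("http://localhost:11434/api/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3",           // assumption: any locally pulled model works
    prompt: "Why is the sky blue?",
    stream: false,             // request a single JSON response, not a stream
  }),
});
const { response } = await res.json();
console.log(response);
```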
```bash
npx -y ollama-mcp
```

Add this configuration to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "rawveg-ollama-mcp-github": {
      "command": "npx",
      "args": [
        "-y",
        "ollama-mcp"
      ]
    }
  }
}
```

Restart Claude Desktop, then ask:
"What tools do you have available from ollama mcp?"
No API Key Required
Ollama runs entirely on your machine and its local API is unauthenticated, so no API key is needed. The only setting you are likely to care about is where the Ollama server is listening:
| Variable | Required | Description |
|---|---|---|
| OLLAMA_HOST | No | Address of the Ollama server (Ollama's default is http://localhost:11434) |
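If your Ollama instance listens somewhere other than the default, you can pass its address through the config's env block. Treat this as a sketch only: OLLAMA_HOST is the variable Ollama's own tooling uses, but whether ollama-mcp honors it is an assumption, as is the example address.

```json
{
  "mcpServers": {
    "rawveg-ollama-mcp-github": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"],
      "env": {
        "OLLAMA_HOST": "http://192.168.1.50:11434"
      }
    }
  }
}
```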
"What resources are available in ollama mcp?"
Claude will query available resources and return a list of what you can access.
"Show me details about [specific item] in ollama mcp"
Claude will fetch and display detailed information about the requested item.
"Create a new [item] in ollama mcp with [details]"
Claude will use the appropriate tool to create the resource and confirm success.
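The "show me details" prompt likely corresponds to Ollama's documented `/api/show` endpoint. A sketch of the equivalent direct call, where `llama3` is a placeholder model name:

```typescript
// Fetch model metadata (template, parameters, etc.) directly from Ollama.
// Assumes the default Ollama address http://localhost:11434.
const res = await fetch("http://localhost:11434/api/show", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ model: "llama3" }), // placeholder model name
});
console.log(await res.json());
```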