This MCP server provides documentation for setting up a local, private LLM server on Debian. It includes tools for chat, web search, RAG, model management, MCP servers, image generation, and TTS. It connects to various AI models and APIs for these functionalities. Developers use it to deploy and manage private LLM servers with advanced capabilities.
Add this configuration to your `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "varunvasudeva1-llm-server-docs-github": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-varunvasudeva1-llm-server-docs-github"
      ]
    }
  }
}
```

Restart Claude Desktop, then ask:

"What tools do you have available from llm server docs?"
API Key Required
This server requires an API key from llm server docs. Provide it either as an environment variable or in your client configuration.
| Variable | Required | Description |
|---|---|---|
| LLM_SERVER_DOCS_API_KEY | Yes | Your llm server docs API key |
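One way to supply the key is through the `env` field of the server entry in `claude_desktop_config.json`, which passes environment variables to the spawned server process. This is a sketch; the placeholder value must be replaced with your actual key:

```json
{
  "mcpServers": {
    "varunvasudeva1-llm-server-docs-github": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-varunvasudeva1-llm-server-docs-github"
      ],
      "env": {
        "LLM_SERVER_DOCS_API_KEY": "<your-api-key>"
      }
    }
  }
}
```

Alternatively, export `LLM_SERVER_DOCS_API_KEY` in the shell environment from which the client is launched.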
"What resources are available in llm server docs?"
Claude will query available resources and return a list of what you can access.
"Show me details about [specific item] in llm server docs"
Claude will fetch and display detailed information about the requested item.
"Create a new [item] in llm server docs with [details]"
Claude will use the appropriate tool to create the resource and confirm success.