The ellama MCP server provides Emacs integration for large language models. It exposes functions to send prompts, receive responses, and manage conversations. The server connects to local or remote LLMs via APIs. Developers use it to build Emacs-based AI workflows, such as code generation, documentation, and chat interfaces.
Add this configuration to your claude_desktop_config.json:

```json
{
  "mcpServers": {
    "s-kostyaev-ellama-github": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-s-kostyaev-ellama-github"
      ]
    }
  }
}
```

Restart Claude Desktop, then ask:
"What tools do you have available from ellama?"
No additional configuration is required beyond the entry above; the server works out of the box.
"What resources are available in ellama?"
Claude will query available resources and return a list of what you can access.
"Show me details about [specific item] in ellama"
Claude will fetch and display detailed information about the requested item.
"Create a new [item] in ellama with [details]"
Claude will use the appropriate tool to create the resource and confirm success.
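Under the hood, each of the prompts above becomes a JSON-RPC 2.0 exchange between Claude Desktop and the server, using MCP methods such as `tools/list`, `resources/list`, and `tools/call`. The sketch below builds those request messages to show their shape; the tool name and arguments in the `tools/call` example are hypothetical placeholders, not tools this server is known to expose.

```python
import json


def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request of the kind MCP clients send."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg


# "What tools do you have available?" -> the client enumerates tools.
list_tools = make_request(1, "tools/list")

# "What resources are available?" -> the client enumerates resources.
list_resources = make_request(2, "resources/list")

# "Create a new [item] with [details]" -> the client invokes a tool.
# Tool name and arguments below are illustrative placeholders.
call_tool = make_request(3, "tools/call", {
    "name": "example_tool",
    "arguments": {"query": "ellama"},
})

print(json.dumps(list_tools))
print(json.dumps(call_tool))
```

The server's responses come back as matching JSON-RPC result objects keyed by `id`, which is how Claude pairs each answer with the request it made.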