Jinni provides a server that exposes a project's context to LLMs. It connects to GitHub repositories, allowing agents to access code, documentation, and other project resources. Developers use it to integrate their projects with LLMs for enhanced AI interactions, such as code analysis, documentation generation, and issue resolution.
pip install jinni

Add this configuration to your claude_desktop_config.json:
{
  "mcpServers": {
    "smat-dev-jinni-github": {
      "command": "uvx",
      "args": [
        "jinni-server"
      ]
    }
  }
}

Restart Claude Desktop, then ask:
"What tools do you have available from jinni?"
No configuration required. This server works out of the box.
"What resources are available in jinni?"
Claude will query available resources and return a list of what you can access.
"Show me details about [specific item] in jinni"
Claude will fetch and display detailed information about the requested item.
"Create a new [item] in jinni with [details]"
Claude will use the appropriate tool to create the resource and confirm success.
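If Claude does not list any jinni tools after a restart, the most common cause is malformed JSON in claude_desktop_config.json. A minimal sanity check using only the Python standard library is sketched below; the file path is an assumption (the default macOS location) and should be adjusted for your OS:

```python
import json

# Assumed default macOS location of the Claude Desktop config; on other
# platforms the file lives elsewhere.
CONFIG_PATH = "~/Library/Application Support/Claude/claude_desktop_config.json"

def registered_servers(raw: str) -> list[str]:
    """Return the MCP server names registered in a config string.

    Raises json.JSONDecodeError if the text is not valid JSON, which is
    the usual reason a configured server silently fails to load.
    """
    config = json.loads(raw)
    return sorted(config.get("mcpServers", {}))

# Checking a config string shaped like the snippet on this page:
raw = '{"mcpServers": {"smat-dev-jinni-github": {"command": "uvx", "args": ["jinni-server"]}}}'
print(registered_servers(raw))  # ['smat-dev-jinni-github']
```

If the call raises a decode error, fix the reported line and column in the config file, then restart Claude Desktop again.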