Centralized LLM configuration and documentation management system. Enables operations teams to build and manage skills, commands, agents, and prompts. Supports multiple LLM coding tools, including Claude Code, Codex, and OpenCode. Integrates with MCP servers for deployment and management.
Install by cloning the repository:

git clone https://github.com/matteocervelli/llms.git

Copy the install command above and run it in your terminal.
Launch Claude Code, Cursor, or your preferred AI coding agent.
Use the prompt template or examples below to test the skill.
Adapt the skill to your specific use case and workflow.
Create a centralized LLM configuration for [COMPANY] in the [INDUSTRY] sector. The configuration should include skills, commands, agents, and prompts. Support multiple LLMs like Claude Code, Codex, and OpenCode. Ensure integration with MCP servers for deployment and management. Document all configurations and provide a user-friendly interface for operations teams.
# Centralized LLM Configuration for [COMPANY] in the [INDUSTRY] Sector

## Skills
- **Data Analysis**: Automates data analysis tasks using Claude Code.
- **Customer Support**: Manages customer queries with Codex.
- **Code Generation**: Generates and optimizes code with OpenCode.

## Commands
- `analyze_data`: Processes and analyzes data sets.
- `generate_report`: Creates comprehensive reports from analyzed data.
- `support_query`: Handles customer support queries efficiently.

## Agents
- **Data Agent**: Specializes in data-related tasks.
- **Support Agent**: Focuses on customer support.
- **Code Agent**: Handles code generation and optimization.

## Prompts
- **Data Analysis Prompt**: "Analyze the provided data set and generate insights."
- **Customer Support Prompt**: "Respond to the customer query with a detailed solution."
- **Code Generation Prompt**: "Generate optimized code for the given requirements."

## Integration with MCP Servers
- **Deployment**: Automated deployment of configurations to MCP servers.
- **Management**: Centralized management of all LLM configurations.

## Documentation
- **User Guide**: Comprehensive guide for operations teams.
- **API Documentation**: Detailed API documentation for developers.
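The outline above groups configuration into skills, commands, agents, and prompts. A minimal Python sketch of how such a registry might be modeled is shown below; the class name, fields, and methods are illustrative assumptions, not the repository's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class LLMConfig:
    """Hypothetical container for the configuration units described above."""
    company: str
    skills: dict[str, str] = field(default_factory=dict)    # skill name -> tool (e.g. "Claude Code")
    commands: dict[str, str] = field(default_factory=dict)  # command name -> description
    prompts: dict[str, str] = field(default_factory=dict)   # prompt name -> prompt text

    def register_skill(self, name: str, tool: str) -> None:
        # Associate a skill with the tool that executes it.
        self.skills[name] = tool

    def render(self) -> str:
        """Render the configuration as a Markdown outline like the one above."""
        lines = [f"# Centralized LLM Configuration for {self.company}", "", "## Skills"]
        lines += [f"- **{name}**: uses {tool}." for name, tool in self.skills.items()]
        return "\n".join(lines)

config = LLMConfig(company="[COMPANY]")
config.register_skill("Data Analysis", "Claude Code")
config.register_skill("Code Generation", "OpenCode")
print(config.render())
```

In practice a tool like this would load such definitions from versioned files in the repository rather than building them in code, so that operations teams can review changes through normal git workflows.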