9router is an AI proxy that routes requests to multiple AI models like Claude Code, Codex, Cursor, OpenAI, Gemini, and Copilot. It benefits developers and operations teams by abstracting away model-specific APIs, enabling integration into existing workflows and tools.
```bash
git clone https://github.com/decolua/9router.git
```
1. **Define your task and criteria.** Specify whether you prioritize speed, cost, accuracy, or model-specific strengths (e.g., code generation, creative writing).
   *Tip:* Use 9router's built-in criteria like `fastest`, `cheapest`, or `best_for_code` to simplify routing. For example: "Route to the fastest model for [TASK]."
2. **Integrate 9router into your workflow.** If you use an IDE like Cursor or VS Code, configure the 9router plugin to handle API calls. For CLI tools, set 9router as the proxy for your AI requests.
   *Tip:* Check 9router's documentation for IDE-specific setup. For the CLI, set environment variables like `export AI_PROXY=9router` before running your scripts.
3. **Send your request through 9router.** Include task details and any constraints (e.g., "Generate a React component with TypeScript, prioritize speed").
   *Tip:* Use placeholders like [TASK] or [CRITERIA] in your prompts to make them reusable. For example: "Route [TASK] to the model best suited for [CRITERIA]."
4. **Review the routed model, request, and output.** Verify that the selected model aligns with your criteria and that the output meets your standards.
   *Tip:* If the output isn't optimal, adjust your criteria or task description and reroute. For example, switch from "fastest" to "most accurate" if the initial output is flawed.
5. **Automate repetitive tasks.** Use 9router in scripts or CI/CD pipelines to dynamically route AI requests based on workload or model availability.
   *Tip:* For automation, log the routed model and output for debugging. Example: `echo "Routed to $(9router get-model) for task: $TASK" >> routing_log.txt`
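The define-route-review loop above can be sketched in a few lines of Python. Note that the `route` helper and the model table below are illustrative assumptions for this walkthrough, not 9router's actual API; consult the repository for the real interface.

```python
# Illustrative sketch of criteria-based routing (NOT 9router's real API).
# The criteria names mirror the built-ins mentioned above; the model
# assignments are hypothetical placeholders.
CANDIDATES = {
    "fastest": "gemini",
    "cheapest": "copilot",
    "best_for_code": "claude-code",
}

def route(task: str, criteria: str = "fastest") -> dict:
    """Pick a model for `task` by `criteria` and return a routing record."""
    model = CANDIDATES.get(criteria)
    if model is None:
        raise ValueError(f"unknown criteria: {criteria}")
    return {"task": task, "criteria": criteria, "model": model}

record = route("Generate a React component with TypeScript", "best_for_code")
print(f"Routed to {record['model']} for task: {record['task']}")
```

Logging the returned record (step 5's tip) gives you the audit trail needed to compare criteria choices across runs.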
Copy the install command below and run it in your terminal:

```bash
git clone https://github.com/decolua/9router
```

If the command does not work for your setup, check the GitHub repository for manual installation instructions.
1. Launch Claude Code, Cursor, or your preferred AI coding agent.
2. Use the prompt template or examples below to test the skill.
3. Adapt the skill to your specific use case and workflow.
Use 9router to route a [TASK_DESCRIPTION] to the most suitable AI model. Prioritize [CRITERIA] such as cost efficiency, response speed, or task-specific strengths (e.g., code generation, creative writing, or data analysis). Return the model used, the routed request, and the final output. Example: 'Route a request to generate a Python script for web scraping with error handling to the fastest available model in 9router.'
### 9router Output
**Routed Request:** Generate a Python script for web scraping with error handling, prioritizing speed and reliability.
**Model Selected:** Claude Code (via 9router)
**Reason:** Optimized for Python code generation with robust error handling and fast execution.
**Generated Script:**
```python
import logging
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

# Configure logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


class WebScraper:
    def __init__(self, base_url):
        self.base_url = base_url
        self.session = requests.Session()
        self.session.headers.update({'User-Agent': 'Mozilla/5.0'})

    def fetch_page(self, url):
        """Fetch a page, returning its HTML or None on any request failure."""
        try:
            response = self.session.get(url, timeout=10)
            response.raise_for_status()
            return response.text
        except requests.RequestException as e:
            logger.error(f"Failed to fetch {url}: {e}")
            return None

    def parse_links(self, html, base_url):
        """Extract all anchor hrefs, resolved to absolute URLs."""
        soup = BeautifulSoup(html, 'html.parser')
        links = set()
        for link in soup.find_all('a', href=True):
            absolute_url = urljoin(base_url, link['href'])
            links.add(absolute_url)
        return links

    def scrape(self):
        html = self.fetch_page(self.base_url)
        if not html:
            return []
        return self.parse_links(html, self.base_url)


# Example usage
if __name__ == "__main__":
    scraper = WebScraper("https://example.com")
    links = scraper.scrape()
    print(f"Found {len(links)} links:")
    for link in links:
        print(link)
```
**Verification:** The script includes error handling for network requests, uses a timeout, and logs failures. It also resolves relative URLs to absolute ones via `urljoin`. The structure is clear and commented, making it straightforward to extend.
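To check the link-parsing logic yourself without installing `requests` or BeautifulSoup, here is a standard-library-only variant of `parse_links` (my addition, built on `html.parser`; it is not part of the generated script) exercised on inline HTML:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkParser(HTMLParser):
    """Collect absolute hrefs from <a> tags, mirroring parse_links above."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(urljoin(self.base_url, value))


html = '<a href="/about">About</a> <a href="https://other.example/x">X</a>'
parser = LinkParser("https://example.com")
parser.feed(html)
print(sorted(parser.links))
# ['https://example.com/about', 'https://other.example/x']
```

The same `urljoin` call handles both relative and already-absolute hrefs, which is why the generated script needs no special-casing for either.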