Collaborating-with-codex enables AI agents to delegate coding tasks to the OpenAI Codex CLI, giving teams multi-model collaboration on coding work. It integrates with Python-based workflows and supports Claude agents.
To install, clone the repository and check it for any further setup instructions:

git clone https://github.com/GuDaStudio/collaborating-with-codex.git
1. Launch Claude Code, Cursor, or your preferred AI coding agent.
2. Use the prompt template or examples below to test the skill.
3. Adapt the skill to your specific use case and workflow.
Act as a collaborator with Codex. I will provide you with a programming task and you will generate code snippets, suggest improvements, and help me debug. Here's the task: [TASK]. The programming language is [LANGUAGE]. The goal is [GOAL]. Provide me with code snippets, explanations, and suggestions for improvement.
# Programming Task: Web Scraper for [COMPANY]
## Task Description
Create a web scraper to extract product data from [COMPANY]'s e-commerce website. The scraper should collect product names, prices, and descriptions.
## Code Snippet
```python
import requests
from bs4 import BeautifulSoup

url = 'https://www.[COMPANY].com/products'
response = requests.get(url)
soup = BeautifulSoup(response.text, 'html.parser')

products = []
for product in soup.find_all('div', class_='product'):
    # Extract the fields from each product card
    name = product.find('h2').text
    price = product.find('span', class_='price').text
    description = product.find('p', class_='description').text
    products.append({'name': name, 'price': price, 'description': description})
```
## Suggestions for Improvement
1. **Error Handling**: Add error handling for network requests and parsing.
2. **Rate Limiting**: Implement rate limiting to avoid overwhelming the server.
3. **Data Storage**: Consider storing the scraped data in a database or CSV file.
4. **User-Agent**: Use a user-agent header to mimic a browser request.
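The four suggestions above can be combined in one pass. The sketch below is a minimal illustration, not the skill's own code: the `[COMPANY]` placeholder URL, the CSS selectors, and the `products.csv` filename are assumptions carried over from the snippet above, and the one-second `time.sleep` is only a crude stand-in for real rate limiting.

```python
import csv
import time

import requests
from bs4 import BeautifulSoup

# Assumed placeholder URL, carried over from the snippet above.
BASE_URL = 'https://www.[COMPANY].com/products'
# Suggestion 4: identify the client with a User-Agent header.
HEADERS = {'User-Agent': 'Mozilla/5.0 (compatible; scraper-example)'}


def parse_products(html: str) -> list[dict]:
    """Parse product cards out of an HTML page (selectors assumed)."""
    soup = BeautifulSoup(html, 'html.parser')
    products = []
    for product in soup.find_all('div', class_='product'):
        name = product.find('h2')
        price = product.find('span', class_='price')
        description = product.find('p', class_='description')
        if not (name and price and description):
            continue  # Suggestion 1: skip cards missing expected fields
        products.append({
            'name': name.text.strip(),
            'price': price.text.strip(),
            'description': description.text.strip(),
        })
    return products


def scrape_products(url: str) -> list[dict]:
    try:
        response = requests.get(url, headers=HEADERS, timeout=10)
        response.raise_for_status()  # Suggestion 1: surface HTTP errors
    except requests.RequestException as exc:
        print(f'Request failed: {exc}')
        return []
    time.sleep(1)  # Suggestion 2: pause between requests (crude rate limit)
    return parse_products(response.text)


def save_to_csv(products: list[dict], path: str = 'products.csv') -> None:
    # Suggestion 3: persist results to a CSV file
    with open(path, 'w', newline='', encoding='utf-8') as f:
        writer = csv.DictWriter(f, fieldnames=['name', 'price', 'description'])
        writer.writeheader()
        writer.writerows(products)
```

Splitting fetching (`scrape_products`) from parsing (`parse_products`) also makes the parsing logic testable against saved HTML without hitting the network.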
## Next Steps
1. Test the code with the provided URL.
2. Implement the suggested improvements.
3. Run the scraper and verify the output.