The Supabase Postgres Best Practices skill helps teams optimize their Postgres databases for performance and efficiency. By applying proven strategies, users can improve application responsiveness and reduce operational costs, driving better outcomes for Sales, Marketing, and RevOps teams.
claude install supabase/agent-skills/supabase-postgres-best-practices

Unlock the full potential of your Supabase Postgres database with this expertly crafted Best Practices skill. Tailored for Sales, Marketing, and RevOps teams, it provides actionable guidance for streamlining operations, strengthening data security, and improving application performance. Users benefit from optimized query performance, faster data retrieval, and robust security measures, with time savings that free teams to focus on core business activities and strategic initiatives.
1. **Prepare Your Database**: Ensure you have admin access to your Supabase project and a recent backup of your database.
2. **Run the Analysis**: Use the prompt template in your AI tool (such as Claude or ChatGPT) to get specific optimization recommendations.
3. **Review Recommendations**: Review each suggestion with your development team, considering potential impacts on your application.
4. **Implement Changes**: Apply the recommended changes in a staging environment first, then monitor performance before deploying to production.
5. **Monitor Results**: Use Supabase's built-in monitoring tools or the recommended queries to track performance improvements after implementation.

Tip: For complex databases, rerun this analysis periodically as your data volume and query patterns evolve.
Optimize complex SQL queries to improve execution time.
Streamline data retrieval processes for faster application responses.
Implement advanced indexing strategies to enhance query performance.
Review and refine database schemas to eliminate performance bottlenecks.
claude install supabase/agent-skills/supabase-postgres-best-practices

git clone https://github.com/supabase/agent-skills

Copy the install command above and run it in your terminal.
Launch Claude Code, Cursor, or your preferred AI coding agent.
Use the prompt template or examples below to test the skill.
Adapt the skill to your specific use case and workflow.
Analyze the current schema and configuration of our Supabase Postgres database for [PROJECT_NAME]. Identify 3-5 specific optimizations we should implement to improve performance and efficiency. For each recommendation, explain the expected benefit and provide the SQL commands or configuration changes needed to implement it.
After reviewing your Supabase Postgres database for the 'Customer Insights Dashboard' project, I've identified several optimization opportunities:
1. **Index Optimization**: Your 'user_activity' table has a high write volume but lacks proper indexes. A partial index on the 'last_active' column for recently active users would significantly improve query performance for your active-user analytics. Note that Postgres requires index predicates to use immutable expressions, so the cutoff must be a fixed timestamp rather than `NOW()`; recreate or adjust the index periodically as the 30-day window moves. Implementation:
```sql
-- The index predicate must be immutable, so use a fixed cutoff date
-- instead of NOW() - INTERVAL '30 days' (which Postgres rejects here).
CREATE INDEX idx_user_activity_active
    ON user_activity (last_active)
    WHERE last_active > '2024-01-01';
```
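As a quick sanity check (table and column names assumed from the example above), a query whose filter implies the index predicate can use the partial index; `EXPLAIN` should show an index scan rather than a sequential scan:

```sql
-- A filter at least as strict as the index predicate lets the
-- planner use the partial index.
EXPLAIN
SELECT count(*)
FROM user_activity
WHERE last_active > '2024-06-01';
```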
2. **Connection and Memory Settings**: Your current settings are suboptimal for your traffic patterns. Note that `max_connections` and `shared_buffers` are server-level parameters, not pool settings: changing them requires a server restart, and on a managed Supabase instance they are typically adjusted through the dashboard rather than `ALTER SYSTEM`. For pooling itself, route clients through Supabase's connection pooler (Supavisor). Where you do control the server:
```sql
-- Both settings require a server restart to take effect.
ALTER SYSTEM SET max_connections = 200;
ALTER SYSTEM SET shared_buffers = '4GB';
```
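To verify what the server is actually using before and after a change, `SHOW` reports the live value, and `pg_settings` also indicates whether a restart is still pending for a parameter:

```sql
-- Inspect current values and pending-restart status.
SHOW max_connections;
SHOW shared_buffers;

SELECT name, setting, pending_restart
FROM pg_settings
WHERE name IN ('max_connections', 'shared_buffers');
```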
3. **Query Optimization**: Several of your frequently run queries can be optimized. For example, the daily revenue calculation query can be improved by:
```sql
CREATE MATERIALIZED VIEW daily_revenue AS
SELECT date_trunc('day', created_at) AS day,
       SUM(amount) AS total
FROM transactions
GROUP BY day;
```
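A materialized view is a snapshot, so it must be refreshed as new transactions arrive. A sketch, assuming the `daily_revenue` view above: adding a unique index allows `REFRESH ... CONCURRENTLY`, which rebuilds the view without blocking readers:

```sql
-- CONCURRENTLY avoids locking out readers, but requires a unique index
-- on the materialized view.
CREATE UNIQUE INDEX idx_daily_revenue_day ON daily_revenue (day);

-- Run on a schedule (e.g. via pg_cron) or after batch loads.
REFRESH MATERIALIZED VIEW CONCURRENTLY daily_revenue;
```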
4. **Storage Configuration**: Autovacuum is enabled by default, so for large, write-heavy tables the real win is tuning it to run more aggressively rather than switching it on. Lowering the scale factor makes vacuum trigger after a smaller fraction of the table has changed:
```sql
-- The default autovacuum_vacuum_scale_factor is 0.2 (20% of rows);
-- on large tables that threshold is rarely reached, so dead tuples
-- accumulate. Lower it so vacuum runs more often.
ALTER TABLE large_data_table SET (autovacuum_vacuum_scale_factor = 0.05);
```
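To see whether autovacuum is keeping up, `pg_stat_user_tables` reports dead-tuple counts and the last time each table was vacuumed:

```sql
-- Tables with many dead tuples and a stale last_autovacuum are
-- candidates for more aggressive autovacuum settings.
SELECT relname, n_live_tup, n_dead_tup, last_autovacuum
FROM pg_stat_user_tables
ORDER BY n_dead_tup DESC
LIMIT 10;
```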
5. **Monitoring Setup**: Implement regular performance monitoring with these queries:
```sql
-- Slow queries
SELECT query, calls, total_exec_time FROM pg_stat_statements ORDER BY total_exec_time DESC LIMIT 10;
-- Lock contention
SELECT blocked_locks.pid AS blocked_pid,
       blocking_locks.pid AS blocking_pid,
       blocked_activity.query AS blocked_statement,
       blocking_activity.query AS blocking_statement
FROM pg_catalog.pg_locks blocked_locks
JOIN pg_catalog.pg_stat_activity blocked_activity
  ON blocked_activity.pid = blocked_locks.pid
JOIN pg_catalog.pg_locks blocking_locks
  ON blocking_locks.locktype = blocked_locks.locktype
 AND blocking_locks.database IS NOT DISTINCT FROM blocked_locks.database
 AND blocking_locks.relation IS NOT DISTINCT FROM blocked_locks.relation
 AND blocking_locks.page IS NOT DISTINCT FROM blocked_locks.page
 AND blocking_locks.tuple IS NOT DISTINCT FROM blocked_locks.tuple
 AND blocking_locks.virtualxid IS NOT DISTINCT FROM blocked_locks.virtualxid
 AND blocking_locks.transactionid IS NOT DISTINCT FROM blocked_locks.transactionid
 AND blocking_locks.classid IS NOT DISTINCT FROM blocked_locks.classid
 AND blocking_locks.objid IS NOT DISTINCT FROM blocked_locks.objid
 AND blocking_locks.objsubid IS NOT DISTINCT FROM blocked_locks.objsubid
 AND blocking_locks.pid != blocked_locks.pid
JOIN pg_catalog.pg_stat_activity blocking_activity
  ON blocking_activity.pid = blocking_locks.pid
-- Only report sessions actually waiting on a lock.
WHERE NOT blocked_locks.granted;
```
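The slow-query view above depends on the `pg_stat_statements` extension (shipped with Supabase, though it may need to be created in your database), so enable it before relying on those statistics:

```sql
-- Enable the extension once per database; statistics accumulate from
-- that point on and can be reset to start a fresh measurement window.
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;
SELECT pg_stat_statements_reset();
```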
These optimizations should significantly improve your database performance and reduce operational costs.