12-25-2025, 06:01 AM
I'm a backend developer working on a PostgreSQL database for a SaaS application, and our main customer table has grown to several million rows. We're experiencing painfully slow query performance on searches that involve multiple columns and date ranges, even though I've added basic indexes. I think I need a more sophisticated indexing strategy, but I'm not sure where to start.

For database administrators or developers who have optimized similar workloads: what are the best practices for indexing in this scenario? Should I be looking into composite indexes, partial indexes, or maybe BRIN indexes for the timestamps?

Also, how do you analyze query plans to identify missing indexes without blindly adding them and hurting write performance? And what tools or queries do you use to monitor index usage and bloat over time in a production environment?
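For context, here's roughly the shape of the slow query and the index options I've been weighing (table and column names are simplified placeholders, not our real schema):

```sql
-- Representative slow query: equality filters plus a date range
EXPLAIN (ANALYZE, BUFFERS)
SELECT id, email, created_at
FROM customers
WHERE tenant_id = 42
  AND status = 'active'
  AND created_at >= '2025-01-01'
  AND created_at <  '2025-07-01';

-- Option 1: composite index, equality columns first, range column last
CREATE INDEX CONCURRENTLY idx_customers_tenant_status_created
    ON customers (tenant_id, status, created_at);

-- Option 2: partial index, if most queries only touch active customers
CREATE INDEX CONCURRENTLY idx_customers_active_created
    ON customers (tenant_id, created_at)
    WHERE status = 'active';
```

Is the "equality columns first, range column last" ordering the right rule of thumb here, and when would one of these beat the other?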
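On the monitoring side, is querying the statistics views directly the usual approach? This is a sketch of what I had in mind for spotting unused indexes (I haven't run it in production yet, and for bloat I've seen the pgstattuple extension mentioned but haven't tried it):

```sql
-- Indexes that have rarely or never been scanned since stats were last reset,
-- largest first, as candidates for dropping
SELECT schemaname,
       relname,
       indexrelname,
       idx_scan,
       pg_size_pretty(pg_relation_size(indexrelid)) AS index_size
FROM pg_stat_user_indexes
ORDER BY idx_scan ASC, pg_relation_size(indexrelid) DESC;
```

Is idx_scan from pg_stat_user_indexes reliable enough on its own, or do people cross-check it against something else before dropping an index?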