Tuning PostgreSQL (for fast reads with Django)

We have a Django & PostgreSQL setup running on EC2. Our application is constantly writing to the DB in the background, but these writes are not initiated by user action.

The problem is that when a user does use the system, we need to do one very large read, sometimes with full-text search, of around 20k items. Any tips on tuning for this scenario?
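For context, the read in question looks roughly like the sketch below. This is only an illustration, assuming a reasonably recent Django with django.contrib.postgres available (it did not exist when this question was asked); the Item model, its body field and active flag are placeholders for the real schema.

    # Hedged sketch of the "big read" described above; names are hypothetical.
    from django.contrib.postgres.search import SearchQuery, SearchVector

    from myapp.models import Item  # hypothetical app/model


    def big_read(term):
        # Full-text match combined with an ordinary filter.
        qs = (
            Item.objects
            .annotate(search=SearchVector("body"))
            .filter(search=SearchQuery(term), active=True)
        )
        # iterator() streams rows instead of caching the whole ~20k-object
        # result set inside the queryset.
        for item in qs.iterator():
            yield item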

asked Mar 10 '12 at 00:03

Post your schema; adding additional indexes/keys is the simplest fix. By that I mean creating indexes for common filtering fields, etc.
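As a rough illustration of that suggestion, indexes on common filter fields can be declared on the Django model itself (on recent Django versions with Meta.indexes; older versions would use db_index or index_together). The model and field names below are placeholders, and a migration is needed for them to take effect.

    # Hypothetical model showing index declarations for common filter fields.
    from django.db import models


    class Item(models.Model):
        owner_id = models.IntegerField(db_index=True)   # single-column index
        status = models.CharField(max_length=20)
        created = models.DateTimeField()

        class Meta:
            indexes = [
                # composite index matching a common "status, newest first" filter
                models.Index(fields=["status", "-created"]),
            ]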

I have successfully improved PostgreSQL performance on EC2 by following some of the tips in this video: blip.tv/djangocon/secrets-of-postgresql-performance-5572403. But as j_mcnally says, we need to know more about the queries you are trying to optimize to give concrete advice.

I have written up some performance-tuning steps specifically for Django + Postgres in this blog article: blog.it-agenten.com/2015/04/tuning-django-orm-text-queries

1 Answer

20k items is not that big a read. :)

On EC2, the main things to do are:

  1. Get as much memory as you can rationally afford; EBS performance is terrible, and you want as much cache as you can manage.
  2. Make sure your shared_buffers setting is correct; 25% of available RAM is a good starting point (a quick check is sketched after this list).
  3. Run the big read through EXPLAIN ANALYZE and look for opportunities to create indexes (but don't create indexes without a practical reason; they are expensive to maintain if nothing uses them). See the sketch after this list.
  4. If changing EBS configuration is an option, consider moving to an 8-stripe soft-RAID configuration.
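On point 2, here is one way to check the current shared_buffers value from within Django. This is only a read-out; changing the setting is done in postgresql.conf (followed by a server restart), not from the application.

    # Check the running shared_buffers value through Django's DB connection.
    from django.db import connection

    with connection.cursor() as cur:
        cur.execute("SHOW shared_buffers")
        print("shared_buffers =", cur.fetchone()[0])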
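On point 3, a sketch of capturing the plan for the big read so you can spot sequential scans worth indexing. The queryset here is a stand-in for your actual query; QuerySet.explain(analyze=True) needs Django 2.1+, while the raw-cursor variant works on older versions.

    # Hypothetical queryset standing in for the real "big read".
    from django.db import connection

    from myapp.models import Item  # hypothetical, as above

    big_read_queryset = Item.objects.filter(active=True)

    # Recent Django: run EXPLAIN ANALYZE directly on the queryset.
    print(big_read_queryset.explain(analyze=True))

    # Older Django: feed the compiled SQL to EXPLAIN ANALYZE via a raw cursor.
    sql, params = big_read_queryset.query.sql_with_params()
    with connection.cursor() as cur:
        cur.execute("EXPLAIN ANALYZE " + sql, params)
        for (line,) in cur.fetchall():
            print(line)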

answered Mar 17 '12 at 22:03
