PHP MySQL pagination is slow

My table

Field   Type          Null  Key  Default  Extra
id      int(11)       NO    PRI  NULL     auto_increment
userid  int(11)       NO    MUL  NULL
title   varchar(50)   YES        NULL
hosting varchar(10)   YES        NULL
zipcode varchar(5)    YES        NULL
lat     varchar(20)   YES        NULL
long    varchar(20)   YES        NULL
msg     varchar(1000) YES   MUL  NULL
time    datetime      NO         NULL

That is the table. I simulated 500k rows of data and randomly deleted 270k of them, leaving 230k rows with the auto-increment counter at 500k.

Here are my indexes

Keyname  Type   Unique  Packed  Field   Cardinality  Collation  Null
PRIMARY  BTREE  Yes     No      id      232377       A
info     BTREE  No      No      userid  2003         A
                                lat     25819        A          YES
                                long    25819        A          YES
                                title   25819        A          YES
                                time    25819        A

With that in mind, here is my query:

SELECT * FROM `posts` WHERE `long`>-118.13902802886 AND `long`<-118.08130797114 AND `lat`>33.79987197114 AND `lat`<33.85759202886 ORDER BY id ASC LIMIT 0, 25

Showing rows 0 - 15 (16 total, Query took 1.5655 sec) [id: 32846 - 540342]

The query returned only one page, but because it had to scan all 230k records it still took 1.5 seconds.

Here is the query explained:

id  select_type table   type    possible_keys   key     key_len ref rows    Extra
1   SIMPLE      posts   index   NULL            PRIMARY 4       NULL 25     Using where

So even though the WHERE clause narrows the result down to 16 rows, I still get a slow query.

Now, for example, if I do a broader search:

SELECT * FROM `posts` WHERE `long`>-118.2544681443 AND `long`<-117.9658678557 AND `lat`>33.6844318557 AND `lat`<33.9730321443 ORDER BY id ASC LIMIT 0, 25

Showing rows 0 - 24 (25 total, Query took 0.0849 sec) [id: 691 - 29818]

It is much faster when retrieving the first page: 483 rows match in total (about 20 pages), but I limit the output to 25.

But if I ask for the last page:

SELECT * FROM `posts` WHERE `long`>-118.2544681443 AND `long`<-117.9658678557 AND `lat`>33.6844318557 AND `lat`<33.9730321443 ORDER BY id ASC LIMIT 475, 25

Showing rows 0 - 7 (8 total, Query took 1.5874 sec) [id: 553198 - 559593]

I get a slow query.

My question is: how do I achieve good pagination? When the website goes live, I expect that once it takes off, posts will be created and deleted daily by the hundreds. Posts should be ordered by id or timestamp, and id is not sequential because some records will be deleted. I want standard pagination:

1 2 3 4 5 6 7 8 ... [Last Page]

asked Aug 28 '12 at 10:08

Could you add information about the storage engine and indexes? -

I have indexed every column. And it's the MyISAM engine. -

@CrisG do you want a bounty put on this question? eggyal's answer looks pretty good, but you never responded to his final question in the comments. I'd be willing to put a bounty on this for you if you need it answered (once it's open for a bounty). -

I have updated the question. -

@Matt sure I want a bounty on this question. -

7 Answers

Filter from your results records which appeared on earlier pages by using a WHERE clause: then you do not need to specify an offset, only a row count. For example, keep track of the last id or timestamp seen and filter for only those records with id or timestamp greater than that.
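A minimal sketch of this keyset approach, using Python's built-in sqlite3 as a stand-in for MySQL (the table mirrors the question's `posts`; `next_page` is an invented helper, not part of any answer):

```python
import sqlite3

# In-memory stand-in for the `posts` table from the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO posts (id, title) VALUES (?, ?)",
                 [(i, "post %d" % i) for i in range(1, 101)])

PAGE_SIZE = 25

def next_page(last_id):
    """Keyset pagination: seek past the last id seen instead of using an
    OFFSET, so the engine never scans and discards earlier rows."""
    return conn.execute(
        "SELECT id, title FROM posts WHERE id > ? ORDER BY id ASC LIMIT ?",
        (last_id, PAGE_SIZE)).fetchall()

page1 = next_page(0)             # first page: ids 1..25
page2 = next_page(page1[-1][0])  # pass the last id of page 1: ids 26..50
```

Because the WHERE clause seeks directly to the last id via the primary-key index, page depth no longer affects cost.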

Answered Aug 28 '12 at 10:08

I have seen that solution, but it only lets me offer next/previous page navigation. - c3cris

@CrisG: How stable is the data? - eggyal

@eggyal I do not understand your question. - c3cris

@CrisG: How often is the table written to (relative to the frequency with which it is read)? - eggyal

I have updated the question that should answer your question with my examples. - c3cris

Unfortunately, MySQL has to read (and first sort) all 20,000 rows before it outputs your 30 results. If you can, try narrowing down your search by filtering on indexed columns in the WHERE clause.

Answered Aug 28 '12 at 11:08

Few remarks.

Given that you order by id, on each page you know the id of the first and last record, so rather than LIMIT 200000 you should use WHERE id > $last_id LIMIT 20, and that would be blazingly fast.

The drawback is obviously that you cannot offer a "last" page, or any page in between, if ids are not sequential (some deleted in between). You may then use a combination of the last known id and an offset + limit.

And obviously, having proper indexes will also help with sorting and limiting.
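The last-known-id-plus-offset combination this answer describes might look like the following sketch, again with sqlite3 standing in for MySQL (`jump_pages` is a hypothetical helper; the gapped ids simulate deleted rows):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY)")
# Non-sequential ids: every third id was "deleted".
conn.executemany("INSERT INTO posts (id) VALUES (?)",
                 [(i,) for i in range(1, 201) if i % 3 != 0])

PAGE = 20

def jump_pages(last_id, pages_ahead):
    """Seek past last_id with WHERE (the big, indexed jump), then skip only
    (pages_ahead - 1) * PAGE rows with OFFSET (the small jump)."""
    return conn.execute(
        "SELECT id FROM posts WHERE id > ? ORDER BY id LIMIT ? OFFSET ?",
        (last_id, PAGE, (pages_ahead - 1) * PAGE)).fetchall()

page1 = jump_pages(0, 1)             # page 1
page3 = jump_pages(page1[-1][0], 2)  # jump two pages past page 1, i.e. page 3
```

The offset stays small and bounded by how far the user jumps, rather than growing with absolute page depth.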

Answered Aug 28 '12 at 11:08

It looks like you only have a primary key index. You might want to define an index on the fields you use, such as:

create index idx_posts_id on posts (`id` ASC);
create index idx_posts_id_timestamp on posts (`id` ASC, `timestamp` ASC);

Having a regular index on your key field, besides your primary unique key index, usually speeds up MySQL by a lot.

Answered Aug 28 '12 at 11:08

MySQL loses quite a bit of performance with a large offset; from the MySQL Performance Blog:

Beware of large LIMIT. Using an index to sort is efficient if you need the first few rows, even if some extra filtering takes place so you need to scan more rows by index than requested by LIMIT. However, if you're dealing with a LIMIT query with a large offset, efficiency will suffer. LIMIT 1000,10 is likely to be way slower than LIMIT 0,10. It is true most users will not go further than 10 pages in results; however, Search Engine Bots may very well do so. I've seen bots looking at 200+ pages in my projects. Also, for many web sites, failing to take care of this makes it very easy to launch a DoS attack: request a page with some large number from a few connections and it is enough. If you do not do anything else, make sure you block requests with too-large page numbers.

For some cases, for example if results are static it may make sense to precompute results so you can query them for positions. So instead of query with LIMIT 1000,10 you will have WHERE position between 1000 and 1009 which has same efficiency for any position (as long as it is indexed)
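A sketch of that precomputed-position idea with sqlite3 (the `position` column and `idx_posts_position` index are assumptions for illustration, not part of the question's schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, position INTEGER)")
conn.execute("CREATE INDEX idx_posts_position ON posts (position)")

# ids with gaps from deletions; positions are renumbered densely offline.
ids = [1, 3, 4, 7, 9, 12, 15, 16, 20, 22]
conn.executemany("INSERT INTO posts (id, position) VALUES (?, ?)",
                 [(i, pos) for pos, i in enumerate(ids)])

# A "page" of 3 rows starting at position 3: an indexed range scan,
# equally cheap no matter how deep the page is.
page = conn.execute(
    "SELECT id FROM posts WHERE position BETWEEN ? AND ? ORDER BY position",
    (3, 5)).fetchall()
```

The catch, as the quote notes, is that positions must be recomputed whenever rows are inserted or deleted, so this suits mostly-static result sets.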

Answered Aug 28 '12 at 11:08

Yes, but what if posts are deleted and you're missing 1005-1007, and you want 10? You will receive only 7 results, not 10. - c3cris

If you are using AUTO INCREMENT you may use:

SELECT * FROM `posts` WHERE `id` >= 200000 ORDER BY `id` DESC LIMIT 200000, 30

This way MySQL will have to traverse only rows above 200000.

Answered Aug 28 '12 at 11:08

But even if rows are deleted, the auto increment counter never goes down! - codefreak

For example:

1 ------ record a ---------- detail a
2 ------ record b ---------- detail b

If record 2 is deleted and a new record is inserted, the data will be:

1 ------ record a ---------- detail a
3 ------ record c ---------- detail c

- codefreak

I figured it out. What was slowing me down was ORDER BY: since I used a LIMIT, the further down I paged, the more rows it had to sort. I fixed it by adding a subquery that first extracts the data I want with a WHERE clause, and only then applies ORDER BY and LIMIT:

SELECT * FROM 
    (SELECT * FROM `posts` AS `p` 
        WHERE 
           `p`.`long` > -119.2544681443 
           AND `p`.`long` < -117.9658678557 
           AND `p`.`lat` > 32.6844318557 
           AND `p`.`lat` < 34.9730321443 
    ) AS posttable 
    ORDER BY id DESC 
    LIMIT x, n

By doing that I achieved the following:

id  select_type     table        type   possible_keys   key key_len ref     rows    Extra
1   PRIMARY         <derived2>   ALL    NULL            NULL NULL   NULL    3031    Using filesort
2   DERIVED         p            ALL    NULL            NULL NULL   NULL    232377  Using where

Now I filter the 232k rows with the WHERE clause first, and ORDER BY and LIMIT only have to handle the 3,031 matching rows.

Showing rows 0 - 3030 (3,031 total, Query took 0.1431 sec)
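Sketched with sqlite3 as a stand-in for MySQL, the derived-table shape of this fix looks like the following (the generated coordinates are made-up test data, not the question's rows):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, lat REAL, long REAL)")
# Made-up coordinates: deterministic values spread over a lat/long grid.
conn.executemany(
    "INSERT INTO posts (id, lat, long) VALUES (?, ?, ?)",
    [(i, 33.0 + (i % 50) * 0.05, -118.5 + (i % 40) * 0.02)
     for i in range(1, 1001)])

# Filter with WHERE inside the subquery first, then ORDER BY / LIMIT only
# the small derived set -- the same shape as the asker's fix.
rows = conn.execute("""
    SELECT * FROM
        (SELECT * FROM posts AS p
          WHERE p.long > -118.25 AND p.long < -117.97
            AND p.lat  >   33.69 AND p.lat  <   33.97) AS posttable
    ORDER BY id DESC
    LIMIT 0, 25""").fetchall()
```

Note that the sort and limit now apply to the small bounding-box subset, not the whole table, which is why deep pages stay fast here.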

Answered Aug 29 '12 at 00:08
