Is PHP the best way to send POST requests to hundreds of sites?

I've tried using Rolling Curl, Epi Curl, and other PHP multi-cURL solutions that are out there, and it takes an average of 180 seconds to send POST requests to just 40 sites and receive data from them (and I'm talking about receiving just small little success/fail strings). That is dog slow!

It only does well with a single POST request, which takes around 3-6 seconds, and I don't even know if that's good, because I see others talking about getting 1-second responses, which is crazy.

I've also tried using proc_open to run Linux shell commands (curl, wget), but that is also slow, and not server-friendly.

What I'm pretty much trying to do is a WordPress plugin that is able to manage multiple WordPress sites and do mass upgrades, remote publishing, blogroll management, etc. I know that there is a site out there called managewp.com, but I don't want to use their services because I want to keep the sites I manage private and develop my own. What I notice about them is that their request/response is ridiculously fast, and I am just puzzled at how they're able to do that, especially with hundreds of sites.

So can someone please shed light how I can make these post requests faster?


I've been doing some thinking and I asked myself: what is so important about fetching the response? It's not like the requests that get sent don't get processed properly; they do, 99% of the time!

And so I was thinking maybe I can just send all the requests without waiting for the responses. And if I really want to track those processes and how they went, I can have those child sites send a POST request back with the status of how the process went, have the master site add them to a database table, and have an AJAX request poll every 10 seconds or so for status updates. How does that sound?
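The fire-and-forget idea above can be sketched with a raw socket: open a connection, write the POST, and close without reading the reply. This is a rough sketch under assumptions, not a tested implementation; `post_and_forget` is a made-up helper name, and be aware that some servers abort processing when the client disconnects early (PHP targets usually keep going if `ignore_user_abort` is on).

```php
<?php
// Sketch: send a POST and hang up without reading the response body.
// Assumes plain HTTP on port 80 unless the URL says otherwise.
function post_and_forget($url, array $data) {
    $parts = parse_url($url);
    $host  = $parts['host'];
    $port  = isset($parts['port']) ? $parts['port'] : 80;
    $path  = isset($parts['path']) ? $parts['path'] : '/';

    $body = http_build_query($data);

    $fp = fsockopen($host, $port, $errno, $errstr, 5); // 5 s connect timeout
    if (!$fp) {
        return false; // host unreachable; log $errstr if you care
    }

    $out  = "POST $path HTTP/1.1\r\n";
    $out .= "Host: $host\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($body) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    $out .= $body;

    fwrite($fp, $out); // send the request...
    fclose($fp);       // ...and close without waiting for a reply
    return true;
}
```

Since you never read the socket, the loop over hundreds of sites is bounded mostly by connect time rather than by each remote server's processing time.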

asked Aug 28 '11 at 00:08

They might not be using PHP for the update service (perhaps something multithreaded in Java)? Also, give the PHP http extension a try (it also supports concurrent requests). -

2 Answers

cURL takes about 0.6-0.8 seconds per request

So for about 500 websites it could take from 300 to 400 seconds.

You could whip this through a loop.

$ch = curl_init(); // Init cURL

curl_setopt($ch, CURLOPT_URL, "http://www.example.com/post.php"); // Post location
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // 1 = Return data, 0 = No return
curl_setopt($ch, CURLOPT_POST, true); // This is POST

// Our data
$postdata = array(
    'name1' => 'value1',
    'name2' => 'value2',
    'name3' => 'value3',
    'name4' => 'value4'
); // Close the data array

curl_setopt($ch, CURLOPT_POSTFIELDS, $postdata); // Add the data to the request

$o = curl_exec($ch); // Execute the request

curl_close($ch); // Finish the request. Close it.

This also depends on your connection speed. From a datacenter it should be fine; if you're testing from a home location, the results might not be as good.

answered Aug 28 '11 at 05:08

It's not about how long cURL takes but how long servers take to handle the request. Take into consideration the fact that some URLs don't even exist anymore and if DNS resolving fails, that may take up to a minute. - EarnestoDev

I'm currently working on a project that downloads hundreds of URLs at a time with PHP and curl_multi. Do batches of up to 250 URLs and play with CURLOPT_TIMEOUT and CURLOPT_CONNECTTIMEOUT to refine your code's speed.

I have a cURL class (2500+ lines) handling all cURL magic including multi and straight to file downloads. 250 URLs / 15-25 seconds using decent timeouts. (But I'm not sharing it for free...)

PS: Downloading that many URLs would require using temporary files as cURL download targets and not memory. Just a thought...
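The batching approach this answer describes can be sketched with curl_multi along these lines. This is a minimal sketch, not the answerer's actual class: `fetch_batch` is a hypothetical helper name, and the timeout values are illustrative examples of the CURLOPT_TIMEOUT / CURLOPT_CONNECTTIMEOUT tuning mentioned above.

```php
<?php
// Sketch: POST the same data to a batch of URLs concurrently with curl_multi.
// Timeouts are illustrative; tune them for your targets.
function fetch_batch(array $urls, array $postdata) {
    $mh = curl_multi_init();
    $handles = array();

    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($postdata));
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // give up on dead hosts fast
        curl_setopt($ch, CURLOPT_TIMEOUT, 15);       // cap total time per request
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until every one has finished.
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh, 1.0); // wait for socket activity instead of spinning
    } while ($running > 0);

    $results = array();
    foreach ($handles as $url => $ch) {
        $err = curl_error($ch);
        $results[$url] = ($err !== '') ? false : curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $results;
}
```

The win over a plain loop is that the slowest server in the batch, not the sum of all servers, dominates the wall-clock time; with 250 URLs per batch and sane timeouts, a batch finishes in roughly the time of its worst straggler.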

answered Aug 28 '11 at 06:08

Oh, cute! I told him how to do it... didn't I? But code I put many hours of work into doesn't come free. - EarnestoDev
