
Multi cURL requests | Reduce curl request time tremendously

Are you facing issues with cURL requests taking too long, sometimes even failing with a ‘Request timed out’ or ‘Maximum execution time exceeded’ error? This often happens because you are calling a server N times, one request after another.
You can solve this with multi cURL requests, which reduce the total request time tremendously.

For example, you may need to query a server for an array of ids. If you send one request per id, the total time required is the sum of the execution times of all the requests, which quickly adds up and increases the loading time.

Multi cURL Request

With multi cURL, requests are processed simultaneously (in parallel) instead of one after the other. curl_multi_init() allows multiple cURL handles to be processed asynchronously; it returns a cURL multi handle resource on success, or FALSE on failure.

You can perform multiple simultaneous cURL requests using curl_multi_init().
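As a minimal sketch (with placeholder example.com URLs, not the server used later in this article), creating a multi handle and attaching two easy handles to it looks roughly like this:

<?php
// Minimal sketch: create a multi handle and attach two easy handles to it.
// The example.com URLs are placeholders for illustration only.
$mh  = curl_multi_init();
$ch1 = curl_init('https://example.com/a');
$ch2 = curl_init('https://example.com/b');
curl_setopt($ch1, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch2, CURLOPT_RETURNTRANSFER, 1);
curl_multi_add_handle($mh, $ch1);
curl_multi_add_handle($mh, $ch2);
?>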

curl_multi_exec() processes each of the handles in the stack. It can be called whether or not a handle needs to read or write data.
Signature: curl_multi_exec ( resource $mh , int &$running ) : int
Where –
$mh – a cURL multi handle returned by curl_multi_init().
$running – a reference to a flag that indicates whether the operations are still running.

We can reduce cURL response time by using curl_multi_exec().
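As a minimal sketch, the usual curl_multi_exec() drive loop (assuming $mh already has easy handles attached, as in the snippet above) looks roughly like this:

<?php
// Minimal drive-loop sketch: run all attached transfers until they finish.
$running = null;
do {
    $status = curl_multi_exec($mh, $running); // perform the transfers
    if ($running) {
        curl_multi_select($mh); // wait for activity instead of busy-looping
    }
} while ($running && $status === CURLM_OK);
?>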

But what if, instead of making requests one after another, we could make them simultaneously? For example, if you have to make 10 requests and you send them as one multi cURL request, the total execution time drops to roughly a tenth, or, put another way, it is bounded by the slowest single request.
In this way the response time of cURL requests is reduced effectively; using curl_multi_exec() the response time drops significantly.
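If you want to verify the speed-up yourself, a rough, purely illustrative timing sketch with microtime() can wrap either approach:

<?php
// Rough timing sketch: wrap the sequential loop or the multi cURL version
// in microtime() calls and compare the elapsed times yourself.
$start = microtime(true);
// ... run the sequential loop or the multi cURL code from below here ...
$elapsed = microtime(true) - $start;
echo 'Fetched ' . count($response) . ' responses in ' . round($elapsed, 2) . "s\n";
?>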

Let’s understand cURL and multi cURL requests with an example

Suppose you need to make 100 requests and you have an array of 100 ids. With plain cURL you loop over the ids and call the server once per id, so the total response time is roughly 100 times that of a single request.

<?php
$ids = [100,101,102,103,....]; // array of ids for which we need to fetch records from the server
$response = [];
// server url
$server_url = 'https://worthread.in?id=';
foreach ($ids as $id) {  
  $ch = curl_init();
  curl_setopt($ch, CURLOPT_URL, $server_url.$id); // adding request id in server url
  curl_setopt($ch, CURLOPT_HEADER, TRUE); 
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
  $response[] = curl_exec($ch);
  $httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);// response code
  curl_close($ch);
}

In the above example you use a foreach loop to make one request to the server per id. So, is there a solution to the problem with single cURL requests? The answer is multi cURL, but the PHP manual does not document curl_multi_exec() in much detail, so let’s walk through an example of multi cURL requests.

By using curl_multi_exec(), you can execute requests in parallel and therefore cut the total time, since the requests are processed simultaneously and we are only as slow as the slowest request.

<?php
    $ids = [100,101,102,103,....]; // array of ids for which we need to fetch records from the server
    $response = [];
    // server url
    $server_url = 'https://worthread.in?id=';
    // array of curl handles
    $multiCurl = array();
    // multi handle
    $mh = curl_multi_init();
    //loop through ids to process    
    foreach ($ids as $i => $id) {          
        $multiCurl[$i] = curl_init();
        curl_setopt($multiCurl[$i], CURLOPT_URL,$server_url.$id);
        curl_setopt($multiCurl[$i], CURLOPT_HEADER, TRUE);
        curl_setopt($multiCurl[$i], CURLOPT_RETURNTRANSFER,1);
        curl_multi_add_handle($mh, $multiCurl[$i]);
    }
    $active = null;
    do {
        $status = curl_multi_exec($mh, $active); // run the transfers
        if ($active) {
            curl_multi_select($mh); // wait for activity instead of busy-looping
        }
    } while ($active && $status == CURLM_OK);
    // get response and remove handles
    foreach($multiCurl as $k => $ch) {
        $response[$k] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
    }
    // close
    curl_multi_close($mh);
?>

The responses from the curl_multi_exec() run above end up in the $response array, and you can use them further as per your application logic.
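Note that because CURLOPT_HEADER is set to TRUE, each entry in $response contains the raw headers followed by the body. A post-processing sketch (assuming the server returns JSON, which this article does not specify) could look like this:

<?php
// Hypothetical post-processing sketch: split headers from body and decode.
foreach ($response as $k => $raw) {
    // split the header block from the body (ignores "100 Continue" edge cases)
    $parts = explode("\r\n\r\n", $raw, 2);
    $body  = isset($parts[1]) ? $parts[1] : $parts[0];
    $data  = json_decode($body, true); // assumes a JSON response
    if ($data !== null) {
        // ... use $data as per your application logic
    }
}
?>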

Disadvantage of curl_multi_exec and its solution

Ok, we have talked enough about curl_multi_exec(), its advantages, and how it reduces request time, but wait: ‘every great thing comes with a cost’, and the same is true of curl_multi_exec().

If you loop over all the ids in the above example, or in other words if you execute a very large array of requests with curl_multi_exec(), there is a chance that some requests get lost.

Solution for request loss in curl_multi_exec

Yes, there is a solution for this disadvantage of curl_multi_exec(). The problem only appears when we execute a large number of requests at once.

So, can we limit the number of concurrent requests, and is there still a difference between cURL and multi cURL after limiting them? Yes, there is. First look at the example, then I will explain it.


This code is for illustration purposes only; you will need to adapt it to your own logic.

<?php
    $ids = [100,101,102,103,....]; // array of ids for which we need to fetch records from the server
    $response = [];
    // server url
    $server_url = 'https://worthread.in?id=';
    // array of curl handles
    $multiCurl = array();
    // multi handle
    $mh = curl_multi_init();
    // adding window to limit simultaneous request
    $window = 20;
    if (count($ids) < $window) {
        $window = count($ids); // if there are fewer ids than the window size, shrink the window
    }
        
    //loop through ids to process | limited by window size   
    for ($i = 0; $i < $window; ++$i) { 
        $multiCurl[$i] = curl_init();
        curl_setopt($multiCurl[$i], CURLOPT_URL,$server_url.$ids[$i]);
        curl_setopt($multiCurl[$i], CURLOPT_HEADER, TRUE);
        curl_setopt($multiCurl[$i], CURLOPT_RETURNTRANSFER,1);
        curl_multi_add_handle($mh, $multiCurl[$i]);         
    }
    $running = null;
    do {
        $execrun = curl_multi_exec($mh, $running); // start the initial batch of transfers
    } while ($execrun === CURLM_CALL_MULTI_PERFORM);
    // process completed requests, refill the window and collect responses
    while ($running && $execrun === CURLM_OK) {
        // wait for activity on the handles instead of busy-looping
        if (curl_multi_select($mh) !== -1) {
            do {
                $execrun = curl_multi_exec($mh, $running);
            } while ($execrun === CURLM_CALL_MULTI_PERFORM);
        }
        if ($execrun !== CURLM_OK) break;
        while ($done = curl_multi_info_read($mh)) {
            $output = curl_multi_getcontent($done['handle']);// get content

            if ($output) {
                $response[] = $output;
            }
            // if there are requests left, add the next one to keep the window full
            if ($i < count($ids)) {
                $multiCurl[$i] = curl_init();
                curl_setopt($multiCurl[$i], CURLOPT_URL, $server_url.$ids[$i]);
                curl_setopt($multiCurl[$i], CURLOPT_HEADER, TRUE);
                curl_setopt($multiCurl[$i], CURLOPT_RETURNTRANSFER, 1);
                curl_multi_add_handle($mh, $multiCurl[$i]);
                curl_multi_exec($mh, $running); // kick off the newly added transfer
                ++$i; // move on to the next id
            }

            curl_multi_remove_handle($mh, $done['handle']);
            curl_close($done['handle']);
        }
    }
    // close
    curl_multi_close($mh);
?>

So, in the example above we first add only 20 requests, then use curl_multi_info_read() to check whether a request has completed; whenever one is done, we add the next one.

This way we never have more than $window requests in flight at a time, which is required when there are many requests or the server imposes rate limits.
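A related refinement, sketched under the same assumptions as the code above: inside the curl_multi_info_read() loop you can also inspect $done['result'] to catch transfers that failed (timeouts, DNS errors, and so on) instead of silently dropping them.

<?php
// Sketch: detect failed transfers inside the curl_multi_info_read() loop.
while ($done = curl_multi_info_read($mh)) {
    if ($done['result'] !== CURLE_OK) {
        // log (or queue a retry for) the failed request
        error_log('cURL error: ' . curl_error($done['handle']));
    } else {
        $response[] = curl_multi_getcontent($done['handle']);
    }
    curl_multi_remove_handle($mh, $done['handle']);
    curl_close($done['handle']);
}
?>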

