Solvando's avatar

Parallel requests with Guzzle Async

I'm using Elasticsearch and I have 15 TB of data there. A simple search on a single index takes 10 seconds. I want to make 30 requests, one per index.

If I'm using Guzzle asynchronous requests, why do the requests take minutes? Is the async not really async? Why doesn't it return all the responses quickly, given that the requests are supposedly made in parallel?

Is it maybe about Elasticsearch's requests per server?

0 likes
3 replies
franciscool004's avatar

Can you share some of the code you implemented? Maybe you are not handling concurrency correctly. Also, that might be something to handle with Laravel jobs and queues.

Solvando's avatar

Hello, @franciscool004 !

This is my function. The time is just as slow even though I make the requests concurrently: 30 requests take minutes. My goal is to send 30 requests at once, as if from 30 different computers, and load the content in 10 seconds, as with a single request.

<?php

use GuzzleHttp\Promise;
use GuzzleHttp\Client;

class Elasticsearch
{
    public static function getDataByHash($hash)
    {
        $client = new Client(['base_uri' => 'My ELASTIC SEARCH URL']);

        // here will be 30 requests on 30 different indexes
        $promises = [
            'data1' => $client->getAsync('/index01/_search?q=hash:'.$hash),
            'data2' => $client->getAsync('/index02/_search?q=hash:'.$hash),
            'data3' => $client->getAsync('/index03/_search?q=hash:'.$hash),
            'data4' => $client->getAsync('/index04/_search?q=hash:'.$hash)
        ];

        // Waits until every promise has either fulfilled or rejected.
        $responses = Promise\settle($promises)->wait();

        echo json_encode(json_decode($responses['data1']['value']->getBody()));
    }
}
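For comparison, the 30 requests could be built in a loop instead of being listed by hand. This is only a sketch, assuming Guzzle 7 (where `Promise\Utils::settle` replaces the deprecated `Promise\settle` function), a hypothetical endpoint `http://localhost:9200`, and indexes named `index01`…`index30` — adjust those to your setup. The important detail is that `getAsync()` only queues the request; nothing is actually sent concurrently until `wait()` is called on the combined promise:

```php
<?php

require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Promise\Utils;

// Hypothetical endpoint — replace with your Elasticsearch URL.
$client = new Client(['base_uri' => 'http://localhost:9200']);

$hash = 'example-hash';

// Build all 30 promises up front; none of them is sent yet.
$promises = [];
for ($i = 1; $i <= 30; $i++) {
    $index = sprintf('index%02d', $i);
    $promises[$index] = $client->getAsync(
        '/' . $index . '/_search?q=hash:' . urlencode($hash)
    );
}

// Utils::settle() dispatches the requests concurrently and waits
// for all of them, whether they fulfil or reject.
$results = Utils::settle($promises)->wait();

foreach ($results as $index => $result) {
    if ($result['state'] === 'fulfilled') {
        $data = json_decode($result['value']->getBody(), true);
        // ...merge $data into the combined result set
    } else {
        // The request failed; $result['reason'] holds the exception.
    }
}
```

Note that even with true client-side concurrency, the total wall-clock time is bounded by the slowest request, so if the Elasticsearch cluster itself queues or throttles the 30 searches, the batch will still take far longer than a single 10-second query.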

I hope it's just a wrong approach on my part. :)) Thank you!

alanTWebC's avatar

@solvando did you find a solution to the problem? I'm thinking about refactoring a synchronous request into several async ones, but I'm not sure whether I'll actually save any time. Thanks
