How do you suppose this infinite loop is affecting your experience?
while(true) {
// do HTTP request and other stuff
}
Good day, everybody. I'm trying to understand the proper way to make a paginated call to an external endpoint without PHP crashing every time due to lack of memory. I'm using PHP 8.3.0 with Laravel 11.31, and I'm making a "simple" HTTP GET call to an external endpoint. Below is the code I'm using:
public function handle()
{
    $startTime = now();

    $companies = [11, 12, 13, 14, 92, 128, 137];
    $startDate = '2025-01-01';
    $endDate = '2025-01-31';
    $limit = 50;
    $year = 2025;
    $month = 1;
    $url = config('services.external_endpoint') . '/company/overall';

    foreach ($companies as $company) {
        $this->info("Processing company: $company");

        $cacheKey = "overall:company:$company:year:$year:month:$month";
        $this->info("cacheKey: $cacheKey");

        // Skip companies whose data is already cached.
        if (Redis::exists($cacheKey)) {
            $this->info("The cache key already exists, hence it won't be processed.");
            continue;
        }

        $dbCompany = Company::find($company);
        $apiKey = $dbCompany->api_key;
        $apiSecret = $dbCompany->api_secret;

        // Fetch pages until the endpoint returns an empty result set.
        $page = 0;
        while (true) {
            $queryParameters = [
                'start_date' => $startDate,
                'end_date' => $endDate,
                'page' => $page,
                'limit' => $limit,
                'api_key' => $apiKey,
                'api_secret' => $apiSecret,
            ];

            $response = Http::acceptJson()
                ->timeout(300)
                ->get(url: $url, query: $queryParameters)
                ->throw()
                ->json();

            $results = $response['results'] ?? [];

            if (empty($results)) {
                break;
            }

            $json = json_encode($results);
            Log::info("Page $page string size (in MB): " . (strlen($json) / 1024 / 1024));
            Log::info('Memory usage (in MB): ' . (memory_get_usage(true) / 1024 / 1024));

            // Store each page as a field on the company's Redis hash.
            Redis::hSet($cacheKey, "page:{$page}", $json);

            $page++;
        }

        Log::info("Processed $page pages, moving on.");
    }

    $endTime = now();
    $this->info("Start Time: {$startTime}");
    $this->info("End Time: {$endTime}");
}
I'm using Redis as a cache store because I wanted to test it out: every page is 1+ MB, and the final data size is over 800 MB on disk with SQLite and serialization, while it's around 650 MB with Redis (using json_encode). Please note that I wrapped this logic in a command just for testing. Every time I launch this command, I just see the memory increase until it reaches the PHP default limit of 128 MB.

I've been looking at different ways to implement this, and I'm trying to figure out which one is best. The original logic is wrapped in a Job, where the PHP memory limit is increased to 512 MB, but I don't want to rely on that as a solution, since I believe there are better ways to achieve this. Thank you in advance to whoever steps in.
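For reference, here is a variation of the inner loop I'm considering. It drops the large per-page variables before the next iteration and forces a garbage-collection cycle, on the assumption that the growth comes from references lingering between iterations (the variable names match the command above):

$page = 0;
while (true) {
    $queryParameters['page'] = $page;

    $response = Http::acceptJson()
        ->timeout(300)
        ->get(url: $url, query: $queryParameters)
        ->throw();

    $results = $response->json('results') ?? [];

    if (empty($results)) {
        break;
    }

    Redis::hSet($cacheKey, "page:{$page}", json_encode($results));
    $page++;

    // Drop the big per-page structures before the next iteration and
    // force a collection cycle, in case cyclic references keep them alive.
    unset($response, $results);
    gc_collect_cycles();
}

One more note: memory_get_usage(true) reports the memory PHP has reserved from the operating system, which generally doesn't shrink even after variables are freed, so memory_get_usage(false) may give a better picture of what's actually live.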
Kind regards.
Anyway, I did my research and found the Http::pool() method, and I used it to make the paginated calls concurrently.
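For anyone landing on this later, the shape of it is roughly the following: fetch a small batch of pages concurrently, store each non-empty page, and stop once a page comes back empty. This is a sketch, not the exact code: the batch size of 5 is arbitrary, $url, $queryParameters, and $cacheKey come from the command above, and error handling is simplified. Note that throw() can't be chained inside the pool closure, so each response is checked after the batch resolves:

use Illuminate\Http\Client\Pool;
use Illuminate\Support\Facades\Http;

$page = 0;
$batchSize = 5; // arbitrary concurrency level
$lastPageSeen = false;

while (! $lastPageSeen) {
    // Fire $batchSize page requests concurrently.
    $responses = Http::pool(function (Pool $pool) use ($url, $queryParameters, $page, $batchSize) {
        $requests = [];
        for ($p = $page; $p < $page + $batchSize; $p++) {
            $requests[] = $pool->as("page:{$p}")
                ->acceptJson()
                ->timeout(300)
                ->get($url, array_merge($queryParameters, ['page' => $p]));
        }
        return $requests;
    });

    foreach ($responses as $name => $response) {
        // Connection failures come back as exception instances, not responses.
        if (! $response instanceof \Illuminate\Http\Client\Response) {
            throw $response;
        }
        $response->throw();

        $results = $response->json('results') ?? [];
        if (empty($results)) {
            $lastPageSeen = true; // an empty page means we ran past the end
            continue;
        }

        Redis::hSet($cacheKey, $name, json_encode($results));
    }

    unset($responses);
    $page += $batchSize;
}

The trade-off versus the sequential loop is that you can't stop exactly at the last page, so the final batch usually over-fetches a few empty pages, and each batch holds several pages in memory at once, which raises the peak.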