nunodonato

Struggling to fix memory leaks

Hi folks. An app I'm working on has a console command that exports a lot of data to JSON files (1.json, 2.json, 3.json, etc.). I'm iterating over the main models using chunks, then preparing the data with methods that fetch related models, building an array structure from them, and finally writing everything to the file.

The problem is, I'm getting memory leaks that, given the size of this data, make the script crash every time. It runs fine if I set memory_limit=-1, but obviously that's not a good solution.

I've made sure all the queries are OK: I don't have N+1 problems, I've disabled the query log (is that still relevant?), and I've forced GC cleanup, but to no avail. I'm printing the result of memory_get_usage() at many points in the script, and I can see memory being released when methods exit, but in the main loop memory still grows by around 8 MB per iteration. I've used Xdebug to set a breakpoint in the main loop and inspect which variables are in use, but I can't find anything that looks wrong.

Any pointers? I'm all out of ideas on where to go from here :/
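For reference, the kind of per-iteration logging I'm doing looks roughly like this (simplified; the loop body here is just a stand-in for the real export work):

```php
<?php
// Simplified sketch of the per-iteration memory logging described above.
// The workload inside the loop is a placeholder, not the real app code.

function logMemory(string $label): int
{
    $bytes = memory_get_usage();
    printf("%s: %.1f MB\n", $label, $bytes / 1024 / 1024);
    return $bytes;
}

$before = logMemory('start');

for ($i = 1; $i <= 3; $i++) {
    // ... chunk processing would happen here ...
    $data = array_fill(0, 10000, str_repeat('x', 100)); // stand-in workload
    logMemory("after iteration $i");
    unset($data);
}

$after = logMemory('end');
```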

LaryAI

It sounds like you're dealing with a classic case of memory not being released during a long-running script. Here are a few strategies you can try to mitigate memory leaks in your PHP script:

  1. Unset Variables: After you're done with a variable, especially large arrays or objects, explicitly unset them to free up memory.
unset($largeVariable);
  2. Garbage Collection: Trigger garbage collection manually after a certain number of iterations.
if (($i % 100) === 0) {
    gc_collect_cycles();
}
  3. Optimize Eloquent: If you're using Laravel's Eloquent, make sure you're not inadvertently keeping models in memory.
  • Use ->toArray() or ->toJson() if you don't need the full model functionality.
  • Avoid using Eloquent methods that might load unnecessary data, like ->load() or ->with() if not needed.
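For example, selecting only the columns the export needs and converting to plain arrays keeps the heavy model objects short-lived (a sketch; YourModel and the column names are placeholders):

```php
use App\Models\YourModel; // placeholder model

YourModel::query()
    ->select(['id', 'name', 'created_at']) // fetch only what the export needs
    ->chunkById(500, function ($models) {
        // Convert to plain arrays immediately so the Eloquent objects
        // can be collected at the end of each chunk.
        $rows = $models->map->toArray()->all();
        // ... write $rows to the current JSON file ...
    });
```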
  4. Chunking: You mentioned you're using chunking, which is great. You might want to experiment with the chunk size to find a balance between memory usage and performance.
Model::chunk(200, function ($models) {
    foreach ($models as $model) {
        // Process the model
    }
});
  5. PHP Configuration: Make sure you're not using a persistent connection to the database, as this can keep memory from being released.
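If you suspect the connection layer, you can also drop the connection between batches (a hedged sketch; whether this actually releases memory depends on your driver — Laravel will reconnect lazily on the next query):

```php
use Illuminate\Support\Facades\DB;

// Disconnect between batches to release any buffers the driver holds;
// the next query will transparently re-establish the connection.
DB::connection()->disconnect();
```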

  6. Dependency Injection: If you're using dependency injection, ensure that you're not unintentionally keeping references to large services or objects that could be causing memory leaks.

  7. Profiling: Use a tool like Blackfire or Xdebug to profile your application and find memory hotspots.

  8. Laravel Specific: If you're using Laravel's jobs or queues, make sure you're not dispatching jobs within your loop that could be eating up memory.

  9. Clearing Loaded Relations: If you're using Eloquent, models keep references to any relations you've loaded. Note that ->fresh() doesn't clear anything; it runs another query and returns a new instance. Call $model->unsetRelations() instead when you're done with the related data.

  10. Database Cursors: If you're using a database cursor to iterate through results, make sure you're closing the cursor after you're done with it.
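On recent Laravel versions you can also iterate with cursor(), which hydrates one model at a time instead of a whole chunk (a sketch; YourModel is a placeholder):

```php
use App\Models\YourModel; // placeholder model

// cursor() streams rows via a generator, so only one model instance is
// hydrated at a time. Caveat: with buffered queries (the PDO MySQL
// default), the driver may still hold the full result set in memory.
foreach (YourModel::query()->cursor() as $model) {
    // ... build and write the export row ...
}
```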

  11. PHP Version: Ensure you're using the latest PHP version, as each new version typically includes improvements to memory management.

Here's a simplified example of how you might implement some of these strategies:

use App\Models\YourModel;

$chunkSize = 200; // Adjust chunk size as needed
$iteration = 0;

YourModel::chunk($chunkSize, function ($models) use (&$iteration) {
    foreach ($models as $model) {
        // Process your model here
        $data = $model->toArray(); // Convert to array if you don't need the full model
        // Write to file or process data
        // ...

        // Drop loaded relations so they can be garbage-collected.
        // (Calling fresh() here would run another query and return a
        // *new* instance without freeing anything.)
        $model->unsetRelations();

        unset($model); // Note: the chunk collection still holds a reference
    }

    // Manually trigger garbage collection after a certain number of iterations
    if (++$iteration % 10 === 0) {
        gc_collect_cycles();
    }
});


Remember to monitor the memory usage after implementing these changes to see if there's an improvement. If you're still experiencing issues, it might be worth looking into more advanced profiling to pinpoint the exact cause of the memory leaks.

Sergiu17

Try temporarily removing the part that writes the JSON files and running the script; at the very least you'll identify which part is breaking.

If it's the writing part, then implement generators.
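A minimal sketch of the generator approach in plain PHP (the data source here is a static stand-in; a real implementation would yield rows from the database query instead). Rows are yielded and written one at a time, so the full data set never sits in memory at once:

```php
<?php
// Stand-in for fetching rows from the database: a real implementation
// would yield rows from a query instead of this static array.
function rows(): Generator
{
    foreach ([['id' => 1], ['id' => 2], ['id' => 3]] as $row) {
        yield $row;
    }
}

// Stream a JSON array to a file one element at a time.
function writeJson(string $path, Generator $rows): void
{
    $fh = fopen($path, 'w');
    fwrite($fh, '[');
    $first = true;
    foreach ($rows as $row) {
        if (!$first) {
            fwrite($fh, ',');
        }
        fwrite($fh, json_encode($row));
        $first = false;
    }
    fwrite($fh, ']');
    fclose($fh);
}

writeJson('export.json', rows());
```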

nunodonato

@Sergiu17 no, it's not the writing. I removed it completely and it still leaks in the same way.
