transporindo-pusat's avatar

Why does exporting 92k rows of data using PHPSpreadsheet not leak memory, even with unlimited execution time and memory?

I'm working on a data export functionality in Laravel and need to export a large dataset, approximately 92,000 rows. I'm using PHPSpreadsheet for the export, and I’ve explicitly set the following PHP configurations to avoid any limitations during the export process:

  • max_execution_time = -1 (no execution time limit)

  • memory_limit = 0 (no memory limit)

Despite this, I’ve noticed that the memory usage does not increase significantly during the export, and there’s no memory leak, which is unexpected given the large dataset.

I expected that exporting such a large number of rows without using streaming would lead to high memory usage or even cause memory leaks, but that doesn’t seem to be the case.

Here’s what I’ve tried so far:

  • I’m using PHPSpreadsheet for the export process.

  • I have set max_execution_time = -1 and memory_limit = 0 to remove any limits.

  • I haven’t implemented chunking or streaming anywhere in the export process.

Can anyone explain why this is happening? Is there something in the way PHPSpreadsheet or Laravel handles large exports that prevents memory leaks in this scenario, or am I missing something?
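For reference, the non-streaming approach described above looks roughly like this (a minimal sketch, assuming the phpoffice/phpspreadsheet package and a hypothetical $rows array; not the asker's actual code):

```php
<?php
// Requires: composer require phpoffice/phpspreadsheet
require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\Spreadsheet;
use PhpOffice\PhpSpreadsheet\Writer\Xlsx;

// Lift the limits, as described above.
set_time_limit(0);
ini_set('memory_limit', '-1');

$spreadsheet = new Spreadsheet();
$sheet = $spreadsheet->getActiveSheet();

// $rows is a hypothetical array of ~92k rows fetched up front,
// e.g. from the Laravel query builder via ->get().
$rowIndex = 1;
foreach ($rows as $row) {
    $sheet->fromArray($row, null, 'A' . $rowIndex++);
}

(new Xlsx($spreadsheet))->save('export.xlsx');
```

Every cell lives in the in-memory cell collection until save() runs, so peak memory scales with the number of cells; 92k modest rows can still fit comfortably on a typical server, which may be why no problem shows up.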

Glukinho's avatar

Two options:

  1. 92k rows is nothing fantastic; depending on row size, they can all sit in memory without any problem. I used PhpSpreadsheet on similarly sized XLSX files on a 2 GB RAM (+2 GB swap) virtual machine without any optimizations/streaming/chunking at all, and had no problems.

  2. PhpSpreadsheet was developed with careful memory handling in mind and uses chunking implicitly where it can.
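One concrete knob on the memory-handling side: by default PhpSpreadsheet keeps all cells in an in-memory collection, but it lets you swap in any PSR-16 cache for cell storage if memory ever does become a problem (a sketch, assuming the phpoffice/phpspreadsheet package and a PSR-16 implementation of your choice):

```php
<?php
require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\Settings;

// $cache is a hypothetical \Psr\SimpleCache\CacheInterface instance,
// e.g. a filesystem- or APCu-backed cache from any PSR-16 library.
Settings::setCache($cache);

// Spreadsheets created after this point store their cells through the
// cache instead of holding everything in PHP memory, trading speed
// for a lower peak footprint.
```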

BTW, it's strange to ask why something IS NOT broken and works fine, usually people ask about why it IS broken :)

transporindo-pusat's avatar

@glukinho

So, the conclusion is that PHPSpreadsheet already handles this, right? I just wanted to work out whether I must use streaming for my Excel export or not.

That's why, when I tested, nothing broke and it worked fine.

But what if I'm using LaravelExcel? Will it behave the same as PHPSpreadsheet?

And why, when I get() all 92k rows with the Laravel query builder, is the HTTP response a 500 with no error shown?

Snapey's avatar

@transporindo-pusat I suggest you read about chunking

LaravelExcel is a Laravel-specific wrapper around PhpSpreadsheet, so I would expect them to behave similarly.

Why does get() fail when trying to load all rows? Because you are trying to load all data into memory at the same time.
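For the record, the chunked alternative looks roughly like this (a minimal sketch using Laravel's query builder, assuming a hypothetical orders table; the callback body stands in for whatever writes each batch to the spreadsheet):

```php
<?php

use Illuminate\Support\Facades\DB;

// Instead of DB::table('orders')->get() pulling all 92k rows at once,
// chunk() runs a paged query and hands over 1,000 rows at a time, so
// only one batch is in memory. An orderBy is required for chunking.
DB::table('orders')->orderBy('id')->chunk(1000, function ($rows) {
    foreach ($rows as $row) {
        // write $row to the sheet / output stream here
    }
});
```

chunkById() is a common alternative when rows may be inserted or deleted while the export runs, since it pages by primary key rather than offset.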

The HTTP response is 500 but no error is shown?

If you ever want to know what caused a 500 error, look in the log files, although error reporting can be missing under out-of-memory conditions.

Glukinho's avatar

Laravel-excel uses PhpSpreadsheet, so yes, I think it will be the same.

why when get() the data in the Laravel query builder with 92k rows, the HTTP response is 500 but no error is shown?

I don't know your data or your code, so I can't tell why you get a 500 HTTP error. It may not be about memory at all. Maybe share your code?

As a general rule, if it works the simple way, leave it simple. In other words, start thinking about optimizations when you face issues, not before.
