Hi @madala ,
That is odd; I usually export this many records to Excel without problems.
Still, there are three things you could try:
1. Use smaller chunks
I usually cap my chunks at 500 records; smaller chunks use less memory per chunk.
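In Laravel Excel 3.1, chunked reading can be enabled with the `WithChunkReading` concern. A minimal sketch, assuming an export class and an `Order` model (both names are placeholders):

```php
<?php

namespace App\Exports;

use App\Models\Order;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithChunkReading;

class OrdersExport implements FromQuery, WithChunkReading
{
    public function query()
    {
        // Return a query builder; the package will page through it.
        return Order::query();
    }

    // Cap each chunk at 500 records to keep memory usage per chunk low.
    public function chunkSize(): int
    {
        return 500;
    }
}
```

With `WithChunkReading`, the package reads the query in pages of `chunkSize()` records instead of hydrating everything at once.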
2. Use a dedicated Export
Take a look at the Laravel Excel docs:
https://laravel-excel.maatwebsite.nl/3.1/exports/
I don't know if it is more optimized than a direct export, but I have always used a dedicated Export.
I can't tell whether it will help without some benchmarking, but in my opinion the benefits are separation of concerns and easier configuration.
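A minimal dedicated Export following the docs linked above might look like this (the `OrdersExport` name, `Order` model, and columns are placeholders, not your actual schema):

```php
<?php

namespace App\Exports;

use App\Models\Order;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithHeadings;

class OrdersExport implements FromQuery, WithHeadings
{
    use Exportable;

    public function query()
    {
        // Select only the columns you actually export.
        return Order::query()->select('id', 'customer', 'total');
    }

    public function headings(): array
    {
        return ['ID', 'Customer', 'Total'];
    }
}

// In a controller action:
// return (new OrdersExport)->download('orders.xlsx');
```

Keeping the query and headings in one class is what gives you the separation of concerns: the controller only decides *when* to export, not *what*.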
3. Offload the export to a queued job.
I also use direct exports just like you, but for larger ones I generally offload the work to a queued job.
A queued job runs using the CLI php.ini settings: usually there is no time limit, and the memory limit is configured higher than in the web process.
An approach I use is to create an exports table with a status column: during the request I mark the export as pending, when the job starts I update it to processing, and when it finishes I update it to completed (or error if an exception was thrown).
With the aid of this auxiliary table I can update the front end accordingly.
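The request side of that approach could be sketched like this. It assumes an `Export` Eloquent model over the status table and a hypothetical `MarkExportCompleted` job; with the `Exportable` trait, Laravel Excel's `queue()` returns a dispatch you can chain jobs onto:

```php
<?php

use App\Exports\OrdersExport;
use App\Models\Export;            // assumed model over the exports table
use App\Jobs\MarkExportCompleted; // hypothetical job: sets status to completed

// During the HTTP request: record the export, then hand off to the queue.
$export = Export::create(['status' => 'pending']);

(new OrdersExport)->queue('orders.xlsx')->chain([
    // Runs only after the queued export finished successfully.
    new MarkExportCompleted($export->id),
]);

// The job is now on the queue; mark the row so the front end can poll it.
$export->update(['status' => 'processing']);
```

For the error state, the chained job's `failed()` hook (or a dedicated failure listener) is a reasonable place to set the status to error.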
Hope it helps.