
madala

Big export using Laravel Excel (maatwebsite/excel)

Hi, I'm exporting data with a total of 10,000 records, and I'm getting the error "unable to handle the request, HTTP ERROR 500".

Here is my code:

```php
public function downloadExcel(Request $request)
{
    $status   = $request->get('status') ? $request->get('status') : "";
    $bfy      = $request->get('bfy') ? $request->get('bfy') : "";
    $progCat  = $request->get('progCat') ? $request->get('progCat') : "";
    $progType = $request->get('progType') ? $request->get('progType') : "";
    $progTrn  = $request->get('progTrn') ? $request->get('progTrn') : "";

    $query = DB::table('subaward_detail_report_v')->select(['subaward_detail_report_v.*']);

    if ($status != "" && $status != "All") {
        $query = $query->where('active_fl', '=', $status);
    }
    if ($bfy != "") {
        $query = $query->where('fa_bfy', '=', $bfy);
    }
    if ($progCat != "") {
        $query = $query->where('program_cat', '=', $progCat);
    }
    if ($progType != "") {
        $query = $query->where('program_type', '=', $progType);
    }
    if ($progTrn != "") {
        $query = $query->where('program_trn_sb734', '=', $progTrn);
    }

    $query = $query->orderBy('fa_bfy', 'DESC');

    ini_set('max_execution_time', 600);
    ini_set('memory_limit', '4096M'); // note: without the "M" suffix, PHP reads the value as bytes

    Excel::create('Report', function ($excel) use ($query) {
        $excel->sheet('report', function ($sheet) use ($query) {
            $sheet->appendRow($this->columns());

            $query->chunk(1000, function ($rows) use ($sheet) {
                foreach ($rows as $row) {
                    $sheet->appendRow($this->rows($row));
                }
            });
        });
    })->download('xlsx');
}
```

I increased the execution time and memory limit, but I'm still getting the same issue. Can anyone help me with this?

Thanks, Balaram

0 likes
7 replies
rodrigo.pedra

Hi @madala ,

That is weird; I usually export this many records to Excel without problems.

There are three things you could try, however:

1. Use smaller chunks

I usually cap my chunks at 500 records; smaller chunks use less memory at a time.
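
In your snippet, that only changes the chunk size (same names as your code):

```php
// Same loop as in your code, just with smaller chunks:
// a few extra queries in exchange for a lower peak memory footprint.
$query->chunk(500, function ($rows) use ($sheet) {
    foreach ($rows as $row) {
        $sheet->appendRow($this->rows($row));
    }
});
```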

2. Use a dedicated Export

Take a look at the Laravel Excel docs:

https://laravel-excel.maatwebsite.nl/3.1/exports/

I don't know if it is more optimized than exporting directly from the controller, but I have always used a dedicated Export.

I can't tell if it is going to help without doing some benchmarking, but in my opinion the benefits are separation of concerns and easier configuration.
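
As a rough sketch of what such an Export could look like with the 3.1 concerns from the docs linked above (the class name and headings here are made up, and I'm assuming FromQuery accepts your base query builder; the docs' examples use Eloquent):

```php
<?php

namespace App\Exports;

use Illuminate\Support\Facades\DB;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithHeadings;

class SubawardDetailExport implements FromQuery, WithHeadings
{
    use Exportable;

    public function query()
    {
        // FromQuery executes the query in chunks behind the scenes,
        // so the full result set is never held in memory at once.
        return DB::table('subaward_detail_report_v')->orderBy('fa_bfy', 'DESC');
    }

    public function headings(): array
    {
        return ['BFY', 'Program Category', 'Program Type']; // your real column headers
    }
}
```

In the controller it then becomes a one-liner: `return (new SubawardDetailExport)->download('report.xlsx');`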

3. Offload the export to a queued job

I also use direct exports just like you do, but for larger exports I generally offload the work to a queued job.

A queued job will run using the CLI php.ini settings. Usually there is no time limit and the memory is configured to a larger size than in the web process.

An approach I use is to create an exports table with a status column: during the request I mark the export as pending, when the job starts I update it to processing, and when it finishes I update it to completed (or error if an exception happened).

With the aid of this auxiliary table I can update the front end accordingly.
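
A minimal sketch of that flow, assuming a hypothetical `ExportStatus` model backing the exports table and a hypothetical `MarkExportCompleted` job (with Laravel Excel 3.1, exports using the `Exportable` trait can be queued and chained):

```php
// Controller sketch: record the export, then hand the work to the queue.
$export = ExportStatus::create(['status' => 'pending']); // hypothetical model

(new SubawardDetailExport)
    ->queue('exports/report.xlsx') // runs on a queue worker, under the CLI php.ini limits
    ->chain([
        // Hypothetical job that flips the status to 'completed';
        // chained jobs only run if the export jobs succeed.
        new MarkExportCompleted($export->id),
    ]);
```

The front end can then poll the exports table to show pending / processing / completed.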

Hope it helps.

1 like
juanca2626

@rodrigo.pedra Hello, I am using queues to export 450 thousand records. A direct download takes 8 minutes to create the file, but when I try to save it to storage it takes over an hour, and I keep wondering what the reason could be.

Takes 8 minutes: `return Excel::download(new RowsExport(), 'file.xlsx');`

Never finishes (it runs for more than an hour and produces nothing): `(new RowsExport())->store('/export/file.xlsx', 'public');`

Both produce the same file.

rodrigo.pedra

One more thing: did you check the logs to see the exact exception being thrown when it errors?

An HTTP 500 error is not always a timeout.

Sorry if you already checked this; I re-read your post and couldn't find anything about it.

Snapey

If you are on Laravel 5.7, I would use the dump server and put some diagnostics in the chunk loop so you can see how far it's getting.
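
Something along these lines inside the chunk callback, assuming the Laravel 5.7 dump server is running (`php artisan dump-server`):

```php
$processed = 0;

$query->chunk(1000, function ($rows) use ($sheet, &$processed) {
    foreach ($rows as $row) {
        $sheet->appendRow($this->rows($row));
    }

    $processed += count($rows);
    dump("Appended {$processed} rows so far"); // picked up by the dump server, not the HTTP response
});
```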

1 like
damms005

"A queued job will run using the CLI php.ini settings. Usually there is no time limit and the memory is configured to a larger size than in the web process"

...if you are not on shared hosting :P

rgrissinger

I got mine to work (~400,000 rows) with laravel-excel using the generator() method and Laravel's cursor() method. Here is a greatly simplified example:

```php
public function generator(): \Generator
{
    // cursor() hydrates one model at a time instead of loading the whole result set.
    $query = Deal::select('id', 'deal_id', 'dealer_id')->cursor();

    foreach ($query as $deal) {
        Log::channel('export')->info('Exporting deal', ['deal' => $deal->id]);

        yield [
            $deal->id,
            $deal->deal_id,
            $deal->dealer_id,
        ];
    }
}
```
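
For context, this method would sit on an export class implementing the FromGenerator concern, roughly like this (the class name is illustrative):

```php
<?php

namespace App\Exports;

use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromGenerator;

class DealsExport implements FromGenerator
{
    use Exportable;

    public function generator(): \Generator
    {
        // ... the method shown above ...
    }
}
```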

Read more here: https://docs.laravel-excel.com/3.1/exports/from-generator.html

1 like
