CSV data too large to parse. Suggest a hack to deal with a large set of CSV data

Posted 11 months ago by Inquisitive
public function parseImport(Request $request)
{
    $this->validate($request, [
        'import_file' => 'required',
    ]);

    // Read the uploaded file and parse every line as a CSV row.
    $path = $request->file('import_file')->getRealPath();
    $data = array_map('str_getcsv', file($path));

    // First row is the CSV header.
    $csv_header = array_slice($data, 0, 1);
    $csv_header = array_shift($csv_header);

    // Remaining rows are the actual data.
    $full_csv_data = array_slice($data, 1);

    $db_header_obj = new Van_voter_record();
    $db_header = $db_header_obj->getTableColumns();

    if (count($full_csv_data) > 0) {
        $sample_data = $full_csv_data[0];
        $csv_data_file = Csv_data::create([
            'csv_filename' => $request->file('import_file')->getClientOriginalName(),
            'csv_header_flag' => $request->has('header'),
            'csv_header' => json_encode($csv_header),
            'csv_data' => json_encode($full_csv_data)
        ]);
    } else {
        return redirect()->back();
    }

    return view('admin.van_voter_record.import_fields', compact('csv_header', 'db_header', 'csv_data_file', 'sample_data'));
}

I am trying to store the data in the database by matching fields. This is just the first step: parsing the CSV and storing it in the database as JSON data. It works fine on a small CSV file, but I am getting the following error on a large CSV:

Illuminate\Http\Exceptions\PostTooLargeException: No message
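
From what I could find, this exception is thrown by Laravel's ValidatePostSize middleware before the controller even runs, when the request body is larger than PHP's post_max_size. So part of the problem seems to be the PHP upload limits themselves. Something like this in php.ini (the values are only an illustration, not what I actually have set) should let the upload through, although it obviously does not fix the memory cost of parsing the whole file in one go:

    ; php.ini (illustrative values only)
    upload_max_filesize = 100M
    post_max_size = 100M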

Currently, I am testing with 50k rows of data, and each row has around 50-70 columns. In a real scenario, the row count can reach up to 500k.

I have heard about the chunk method but am not sure how to use it. If anyone has an idea how I can tweak the above code to deal with a large dataset, any help would be appreciated.
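
Something along these lines is what I have in mind: reading the file row by row with fgetcsv() instead of loading it all with file() + array_map(), and inserting in fixed-size batches so memory stays flat. This is only a rough sketch of my guess, not working code. It assumes "use Illuminate\Support\Facades\DB;" at the top of the controller, and the table name (van_voter_records), the csv_row column, and the batch size of 1000 are placeholders:

    // Rough sketch only: stream the CSV and insert in batches instead of
    // json_encode()-ing the whole file at once.
    public function parseImportChunked(Request $request)
    {
        $this->validate($request, [
            'import_file' => 'required|file',
        ]);

        $path   = $request->file('import_file')->getRealPath();
        $handle = fopen($path, 'r');

        // If the file has a header row, read it once and keep it for field matching.
        $csv_header = $request->has('header') ? fgetcsv($handle) : null;

        $batch = [];
        while (($row = fgetcsv($handle)) !== false) {
            // Map the raw row to real columns here instead of storing it as JSON.
            $batch[] = ['csv_row' => json_encode($row)];

            // Flush every 1000 rows so memory stays flat regardless of file size.
            if (count($batch) >= 1000) {
                DB::table('van_voter_records')->insert($batch);
                $batch = [];
            }
        }

        // Insert whatever is left over after the loop.
        if (!empty($batch)) {
            DB::table('van_voter_records')->insert($batch);
        }

        fclose($handle);

        return redirect()->back();
    }

Is this roughly how the chunked approach is supposed to look, or is there a better way to fit it into my original field-matching flow?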
