
pratikkaje

Correct approach to import data into MySQL from a large CSV (rows > 100k)

I am trying to import a large CSV into a MySQL database. I am using league/csv to read the data from the CSV, and I have created migration and model files for the import. For testing purposes I put my import logic in the web.php routes file. The first issue I hit while importing was "Fatal error: Maximum execution time of 30 seconds exceeded". I worked around it by adding "ini_set('max_execution_time', 300);" as the first line of my PHP file. After that, the second error I hit was an nginx 504 gateway timeout. I googled for a solution and found answers suggesting changes to various config files.

Is there a better approach to importing this much data that I can follow, rather than changing config files? Thanks.

Sergiu17

http://php.net/manual/en/language.generators.overview.php

Calling range(0, 1000000) will result in well over 100 MB of memory being used.

As an alternative, we can implement an xrange() generator, which will only ever need enough memory to create an Iterator object and track the current state of the generator internally, which turns out to be less than 1 kilobyte.
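
For illustration, a minimal xrange() generator along the lines of the manual's example could look like this (a sketch, not the manual's exact code):

<?php

// Generator-based alternative to range(): yields one value at a time
// instead of building a million-element array in memory.
function xrange($start, $end, $step = 1) {
    for ($i = $start; $i <= $end; $i += $step) {
        yield $i;
    }
}

foreach (xrange(0, 1000000) as $number) {
    // Each $number is produced on demand; memory use stays roughly constant.
}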

http://mark-story.com/posts/view/php-generators-a-useful-example

http://www.sebastianviereck.de/en/ver-abreiten-large-files-with-php-using-generator/

Hope these examples help you. Short answer: use generators!

This example, I think, is fine:

<?php

class CsvFile {

    protected $file;

    public function __construct($file) {
        $this->file = fopen($file, 'r');
    }

    public function __destruct() {
        fclose($this->file);
    }

    public function parse() {
        // Yield one row at a time so the whole file is never held in memory.
        while (!feof($this->file)) {
            $row = fgetcsv($this->file);

            // fgetcsv() returns false for blank lines and at end of file.
            if ($row !== false) {
                yield $row;
            }
        }
    }
}

$csv = new CsvFile('/pathTo/file.csv');
foreach ($csv->parse() as $row) {
    print_r($row); // $row is an array of column values
    // INSERT INTO DATABASE
}
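
If the inserts go through Laravel, one way to fill in the INSERT INTO DATABASE step is to buffer rows and insert them in batches instead of running one query per row. A rough sketch reusing the CsvFile class above; the imports table and its columns are made up for illustration:

<?php

use Illuminate\Support\Facades\DB;

$csv = new CsvFile('/pathTo/file.csv');

$batch = [];
foreach ($csv->parse() as $row) {
    // Map CSV columns to table columns (hypothetical column names).
    $batch[] = [
        'name'  => $row[0],
        'email' => $row[1],
    ];

    // Flush every 1000 rows to keep memory and per-query overhead low.
    if (count($batch) === 1000) {
        DB::table('imports')->insert($batch);
        $batch = [];
    }
}

// Insert whatever is left over.
if (!empty($batch)) {
    DB::table('imports')->insert($batch);
}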
cmdobueno

I generally write these as console commands.

That being said, I also run these commands manually. I have never had much of a reason to let a user import that many records into the system.

BUT if that is required, you could also enable queues and workers, and register the import as a job.

I have never had an issue with these two approaches timing out. You are dealing with a long-running process, which should probably be done fully on the server side; nginx should not be waiting on the result.
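
To make the queue suggestion concrete, here is a minimal sketch of what such a queued import job could look like in Laravel; the class name, table, and columns are made up, and the real setup is covered in the queue docs linked below:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class ImportCsv implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // A queued job runs in a worker process, so web and nginx timeouts no longer apply.
    public $timeout = 600;

    protected $path;

    public function __construct($path)
    {
        $this->path = $path;
    }

    public function handle()
    {
        $handle = fopen($this->path, 'r');

        while (($row = fgetcsv($handle)) !== false) {
            // Hypothetical table and columns; batch the inserts in real code.
            DB::table('imports')->insert([
                'name'  => $row[0],
                'email' => $row[1],
            ]);
        }

        fclose($handle);
    }
}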

pratikkaje

@cmdobueno Thanks for your suggestion. I am looking at Laravel Horizon. Do you think that would be the right fit for the task?

cmdobueno

Ummmm sure I guess?

I suggest reading:

https://laravel.com/docs/5.7/queues

before going down any path. Laravel queues are built right into Laravel. All you need to do is install something like Supervisor to keep the workers running all the time...

Though, like I said, read the docs and pick whichever solution you feel works best for you. I generally do not use outside services, free or paid, when I can accomplish the same thing on my own server.
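
For completeness, dispatching the hypothetical ImportCsv job sketched earlier and keeping a worker running would look roughly like this:

<?php

use App\Jobs\ImportCsv;

// Dispatch the import onto the queue, e.g. from a controller or a console command.
ImportCsv::dispatch('/pathTo/file.csv');

// Then run a worker to process queued jobs; Supervisor keeps this command alive:
//   php artisan queue:work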
