
Best way to handle huge insertions in Laravel considering both timeout and memory issues

Posted 9 months ago by ssquare

Currently, I am trying to insert data into the database using a Laravel queue. The job imports data from a CSV, formats some columns, and finally dumps the rows into the database.

The CSV is huge: it has more than 500k rows, and the import can take 24 hours or more.

Previously, I thought a queued job would run with no time or memory limits, but it does not seem to work that way.
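
As far as I understand, the limits come from Laravel itself: workers kill a job after its timeout (60 seconds by default), `php artisan queue:work` takes `--timeout` and `--memory` flags, and `retry_after` in `config/queue.php` is expected to be larger than the job timeout. A minimal sketch of raising the per-job limit (the class name here is just a placeholder):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class ImportCsvJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    // Seconds the worker allows this job to run before killing it.
    // Must stay below retry_after in config/queue.php.
    public $timeout = 300;

    public function handle(): void
    {
        // ... import logic ...
    }
}
```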

I am using League CSV and reading the rows in chunks inside a foreach loop, continuing until a read returns no rows.
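
Roughly what the chunked read looks like, simplified; the file path, table, and column names are placeholders, and I am assuming League CSV 9.x:

```php
<?php

use Illuminate\Support\Facades\DB;
use League\Csv\Reader;
use League\Csv\Statement;

$csv = Reader::createFromPath(storage_path('app/import.csv'), 'r');
$csv->setHeaderOffset(0); // first row holds the column names

$offset = 0;
$limit  = 1000; // rows per chunk; tune for memory

do {
    // Pull the next slice of rows from the CSV.
    $records = Statement::create()
        ->offset($offset)
        ->limit($limit)
        ->process($csv);

    $batch = [];
    foreach ($records as $record) {
        // Format columns here, then collect for one bulk insert.
        $batch[] = [
            'name'  => trim($record['name']),
            'email' => strtolower($record['email']),
        ];
    }

    if ($batch !== []) {
        DB::table('contacts')->insert($batch); // one query per chunk
    }

    $offset += $limit;
} while ($batch !== []); // stop when a read returns no rows
```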

How can I deal with this? Let me know if you need any more information.

Also, let me know if there is a way to break the job up and resume from the next chunk, something like the sketch below perhaps, so the timeout and out-of-memory issues go away.
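
For example, a self-dispatching pattern where each job handles one chunk and then queues the next, so every chunk gets a fresh timeout and a fresh memory budget? (Class, table, and column names are placeholders, again assuming League CSV 9.x.)

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\DB;
use League\Csv\Reader;
use League\Csv\Statement;

class ImportCsvChunkJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    private const CHUNK = 1000; // rows handled per job

    public $path;
    public $offset;

    public function __construct(string $path, int $offset = 0)
    {
        $this->path   = $path;
        $this->offset = $offset;
    }

    public function handle(): void
    {
        $csv = Reader::createFromPath($this->path, 'r');
        $csv->setHeaderOffset(0);

        $records = Statement::create()
            ->offset($this->offset)
            ->limit(self::CHUNK)
            ->process($csv);

        $batch = [];
        foreach ($records as $record) {
            // Format columns as needed before the bulk insert.
            $batch[] = [
                'name'  => trim($record['name']),
                'email' => strtolower($record['email']),
            ];
        }

        if ($batch === []) {
            return; // reached the end of the file
        }

        DB::table('contacts')->insert($batch);

        // Re-queue from the next offset; each job finishes well
        // inside the timeout and frees its memory when it exits.
        self::dispatch($this->path, $this->offset + self::CHUNK);
    }
}
```

Kicking it off would then just be `ImportCsvChunkJob::dispatch(storage_path('app/import.csv'));`.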
