Handling massive record sets

Posted 4 years ago by FutureWeb

Hi LaraPeeps,

I have a class that imports large amounts of data. This used to be done by importing directly from the CSV feeds and calling functions to sort and separate the data into different DB tables.

I am now trying to move this to dumping the entire feed into a raw_datas table, then looping through the records and inserting them into the various tables I need. So far LOAD DATA INFILE works great and my functions to insert the data also work well, except when I try to work with lots of records. It seems to handle 5k OK, but the raw_datas table holds up to 1.7 million rows at a time, depending on what deal type I am importing.
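
For context, the processing step currently looks roughly like this (heavily simplified, and the table/column names are just placeholders for my real schema):

```php
use Illuminate\Support\Facades\DB;

// Current approach (simplified): pull every raw row into memory, then
// sort each one into its destination table. Fine at ~5k rows, but it
// falls over long before 1.7 million.
$rows = DB::table('raw_datas')->get();

foreach ($rows as $row) {
    DB::table('deals')->insert([
        'title' => $row->title,
        'price' => $row->price,
    ]);
}
```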

The whole thing needs to run as a daily cron job, so I need a way of importing all the data in chunks to stop it falling over. I know the server is capable of importing all the records in one hit, as my old vanilla PHP import code works fine.
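
The plan for the cron side is just to wrap the importer in an artisan command and schedule it daily, something along these lines (`feeds:import` is only an example name):

```php
<?php

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        // 'feeds:import' is a placeholder name for the command that
        // wraps the whole download/import process.
        $schedule->command('feeds:import')
                 ->dailyAt('02:00')
                 ->withoutOverlapping();
    }
}
```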

The process works like this: the script calls a service which generates the data feed and then in turn hits a URL on my server to say the feed is ready for download. My script then fetches the feed, saves it to the server, unzips it, and dumps it into the raw_datas table.
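
The dump itself is just a LOAD DATA statement run against the unzipped file, roughly like this (path, delimiters and table layout simplified):

```php
use Illuminate\Support\Facades\DB;

// Sketch of the "dump into raw_datas" step. The path is a placeholder
// and the real feed has far more columns. LOAD DATA LOCAL INFILE needs
// local_infile enabled on the MySQL connection.
$path = storage_path('app/feeds/deals.csv');

DB::unprepared("
    LOAD DATA LOCAL INFILE '{$path}'
    INTO TABLE raw_datas
    FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
");
```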

There are 4 CSVs generated and absorbed in this way, ranging in size from 385 KB to 1.2 GB.

Can this be done in Laravel, or should I sack it and go back to old-school PHP for the import? If it can, any ideas on splitting the import down into chunks? I know I can paginate the results; is there a way I can get the function to call itself when it reaches the last page?
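
Something like this self-calling approach is what I had in mind, if it's even sane (untested sketch, page size is a guess):

```php
use Illuminate\Support\Facades\DB;

// Untested idea: process one page of raw_datas at a time and recurse
// until there's nothing left.
function processPage(int $page = 1, int $perPage = 5000): void
{
    $rows = DB::table('raw_datas')
        ->orderBy('id')
        ->forPage($page, $perPage)
        ->get();

    if ($rows->isEmpty()) {
        return; // last page reached, all done
    }

    foreach ($rows as $row) {
        // insert into the relevant destination table here
    }

    processPage($page + 1, $perPage);
}

processPage();
```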

Thanks in advance
