Parse big database to convert data from old system to Laravel
Hello,
I have a PHP LMS and have built a new Laravel-based LMS. The old system has a big MySQL database with millions of records divided across tens of tables.
The table structure is completely different, so I need to query the old database, transform the data, and insert it into the new structure.
I first split the work by table, but the scripts timed out.
I then split each table's data by year, but the scripts still timed out.
Next I created two MySQL connections in Laravel (one to the old DB and one to the new DB) and chained jobs that read from one and write to the other. If I try to parse and insert more than 80k rows, the queue worker is killed.
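For illustration, here is a minimal sketch of that chunked read-transform-insert step; the connection names ('mysql_old', 'mysql_new') and the table/column names are placeholders, not my real schema:

```php
<?php

use Illuminate\Support\Facades\DB;

// Read the old table in fixed-size chunks keyed on the primary key,
// map each row to the new structure, and bulk-insert the chunk.
// 'mysql_old' and 'mysql_new' are the two connections defined in
// config/database.php; table and column names are placeholders.
DB::connection('mysql_old')
    ->table('old_courses')
    ->chunkById(1000, function ($rows) {
        $mapped = [];
        foreach ($rows as $row) {
            $mapped[] = [
                'legacy_id'  => $row->id,
                'title'      => $row->course_name,
                'created_at' => $row->date_added,
            ];
        }
        DB::connection('mysql_new')->table('courses')->insert($mapped);
    });
```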
I considered saving the last successfully migrated ID and using transactions so the process can restart where it left off, but that seems quite complicated given the number of tables.
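A sketch of how that checkpoint idea could look, assuming a hypothetical `migration_checkpoints` bookkeeping table with one row per source table:

```php
<?php

use Illuminate\Support\Facades\DB;

// Resume from the last ID committed for this source table.
// 'migration_checkpoints' is a hypothetical bookkeeping table
// with columns (source_table, last_id).
$lastId = (int) DB::connection('mysql_new')
    ->table('migration_checkpoints')
    ->where('source_table', 'old_courses')
    ->value('last_id');

DB::connection('mysql_old')
    ->table('old_courses')
    ->where('id', '>', $lastId)
    ->chunkById(1000, function ($rows) {
        // Commit each chunk and its checkpoint in one transaction,
        // so a crash never leaves the two out of sync.
        DB::connection('mysql_new')->transaction(function () use ($rows) {
            $mapped = $rows->map(fn ($row) => [
                'legacy_id' => $row->id,
                'title'     => $row->course_name,
            ])->all();

            DB::connection('mysql_new')->table('courses')->insert($mapped);

            DB::connection('mysql_new')
                ->table('migration_checkpoints')
                ->where('source_table', 'old_courses')
                ->update(['last_id' => $rows->last()->id]);
        });
    });
```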
The only solution that worked was to create external PHP scripts that export the data from the old DB into JSON files, split by table and year and already shaped like the new tables, and then import those files into Laravel using chained queues and commands. That way the data conversion runs in one execution and the import in another.
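The export step can also run as a plain artisan command from the CLI, where PHP's web `max_execution_time` does not apply by default. A sketch, with the command name, table, columns, and file layout all assumed:

```php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;

class ExportOldCourses extends Command
{
    // Hypothetical command: exports one year of the old courses table
    // as a JSON file already shaped like the new table.
    protected $signature = 'lms:export-old-courses {year}';
    protected $description = 'Export one year of the old LMS courses table as JSON';

    public function handle(): int
    {
        $year = (int) $this->argument('year');

        // Assumes storage/app/export/ already exists.
        $out = fopen(storage_path("app/export/courses-{$year}.json"), 'w');
        fwrite($out, '[');
        $first = true;

        DB::connection('mysql_old')
            ->table('old_courses')            // placeholder table name
            ->whereYear('date_added', $year)  // placeholder date column
            ->chunkById(1000, function ($rows) use ($out, &$first) {
                foreach ($rows as $row) {
                    // Placeholder mapping from old columns to the new structure.
                    $mapped = [
                        'legacy_id' => $row->id,
                        'title'     => $row->course_name,
                    ];
                    fwrite($out, ($first ? '' : ',') . json_encode($mapped));
                    $first = false;
                }
            });

        fwrite($out, ']');
        fclose($out);

        return self::SUCCESS;
    }
}
```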
Are there any better ways to parse and convert MySQL data with this many rows?
Thank you.

@gabriel27 Would you not be better off creating SQL import scripts, rather than trying to read the data into PHP only to write it back out to a new database?
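If both databases live on the same MySQL server and the connection's user can read both schemas, that can be done with a cross-database `INSERT ... SELECT` issued from Laravel, so the rows never pass through PHP at all. A sketch, with every database, table, and column name assumed:

```php
<?php

use Illuminate\Support\Facades\DB;

// Runs entirely inside MySQL: copy one year of rows from the old
// schema into the new one. 'old_lms' and 'new_lms' are placeholder
// database names; batching by year keeps each statement bounded.
DB::connection('mysql_new')->statement('
    INSERT INTO new_lms.courses (legacy_id, title, created_at)
    SELECT id, course_name, date_added
    FROM old_lms.courses
    WHERE YEAR(date_added) = ?
', [2020]);
```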