Kacadu

Data import

Hello, I was wondering: how would you go about importing a lot of data (e.g. thousands of records) into the database? And how would you keep track of updates and deletes of records from the source (another DB, or anything else that returns data as an array)? The import needs to run periodically (say, every 4 hours). Would you use Eloquent or the query builder? And how would you handle updates (e.g. MySQL's ON DUPLICATE KEY UPDATE)?

I tried something like this and it became a huge mess and took way too long to import the data, let alone the updates. Does anyone have any experience with something like this?

Thanks a lot!

shawnveltman

I spent a couple of years working around this problem before I learned about Upserts. https://laravel.com/docs/9.x/eloquent#upserts

Depending on your table structure, they may be the best way to do what you're asking.
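To make that concrete, here's a minimal sketch of an upsert-based import. The table and column names (`products`, `external_id`, etc.) are hypothetical; the key assumption is that the target table has a unique index on the column(s) you match on.

```php
<?php

use Illuminate\Support\Facades\DB;

// $sourceRecords is whatever your external source returns as an array.
$rows = collect($sourceRecords)->map(fn ($r) => [
    'external_id' => $r['id'],
    'name'        => $r['name'],
    'price'       => $r['price'],
    'updated_at'  => now(),
])->all();

// Insert new rows and update existing ones matched on `external_id`.
// On MySQL this compiles down to INSERT ... ON DUPLICATE KEY UPDATE,
// so the whole batch is a single statement instead of N queries.
DB::table('products')->upsert(
    $rows,
    ['external_id'],                  // unique-by columns (must be uniquely indexed)
    ['name', 'price', 'updated_at']   // columns to update when a row already exists
);
```

This covers inserts and updates in one pass; deletes still need separate handling, e.g. comparing the set of `external_id`s you just imported against what's already in the table.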

I typically write ~1k rows at a time per upsert when I'm doing large data imports. I also like to break the work into jobs: if I'm streaming from a very large file, I'll grab 1k lines, pass those to a job, do any processing required, then upsert the result.
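A rough sketch of that chunk-and-dispatch pattern, assuming a hypothetical queued job class `ImportChunkJob` whose `handle()` does the processing and upsert for its chunk:

```php
<?php

use Illuminate\Support\LazyCollection;

// Stream the file lazily so the whole thing never sits in memory,
// then hand off ~1k lines at a time to the queue.
LazyCollection::make(function () {
    $handle = fopen(storage_path('imports/feed.csv'), 'r');
    while (($line = fgetcsv($handle)) !== false) {
        yield $line;
    }
    fclose($handle);
})
->chunk(1000)
->each(fn ($chunk) => ImportChunkJob::dispatch($chunk->all()));
```

Each job then runs independently on the queue, so a failure in one chunk can be retried without redoing the whole import.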
