Hello, I was wondering: how would you go about importing a lot of data (e.g. thousands of records) into the database? Also, how do you keep track of updates and deletes of records from the source (another DB, or anything that returns data as an array)? It needs to run periodically (say, every 4 hours). Would you use Eloquent or the query builder? And how would you handle updates (e.g. MySQL's ON DUPLICATE KEY UPDATE)?
I tried something like this and it became a huge mess and took way too long to import the data, let alone the updates. Does anyone have any experience with something like this?
Depending on your table structure, upserts may be the best way to do what you're asking.
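If it helps, here's a minimal sketch of what that can look like with the query builder's upsert() (Eloquent models expose the same method). The products table, the unique external_id index, and the column names are placeholders, not anything from your schema:

```php
use Illuminate\Support\Facades\DB;

// Assumes a unique index on `external_id`; on MySQL this compiles down to
// INSERT ... ON DUPLICATE KEY UPDATE under the hood.
DB::table('products')->upsert(
    [
        ['external_id' => 101, 'name' => 'Widget', 'price' => 999],
        ['external_id' => 102, 'name' => 'Gadget', 'price' => 1499],
    ],
    ['external_id'],   // column(s) that uniquely identify a row
    ['name', 'price']  // columns to update when a row already exists
);
```

Note that the columns in the second argument generally need a primary or unique index on them for the conflict detection to work.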
I typically write ~1k rows at a time per upsert when I'm doing large data imports. I also like to break the work into jobs: if I'm streaming from a very large file, I'll grab 1k lines, pass those to a job, do any processing required, then upsert the result. Roughly like the sketch below.
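A rough sketch of that pattern, under a few assumptions: the CSV layout, the ImportChunk job name, and the column mapping are all made up for illustration. The job itself just upserts its chunk:

```php
namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Support\Facades\DB;

class ImportChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public function __construct(private array $rows) {}

    public function handle(): void
    {
        // Map raw CSV columns to table columns (hypothetical layout),
        // then write the whole chunk in a single upsert statement.
        $records = array_map(fn (array $row) => [
            'external_id' => $row[0],
            'name'        => $row[1],
            'price'       => $row[2],
        ], $this->rows);

        DB::table('products')->upsert($records, ['external_id'], ['name', 'price']);
    }
}
```

And the importer streams the file lazily, dispatching one queued job per 1k rows so a failed chunk can be retried without redoing the entire import:

```php
use App\Jobs\ImportChunk;
use Illuminate\Support\LazyCollection;

LazyCollection::make(function () {
    $handle = fopen(storage_path('app/import.csv'), 'r');
    while (($row = fgetcsv($handle)) !== false) {
        yield $row;
    }
    fclose($handle);
})
    ->chunk(1000)
    ->each(fn ($chunk) => ImportChunk::dispatch($chunk->values()->all()));
```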