MySQL & Scaling

Posted 4 years ago by youkaihiei

Not necessarily a Laravel question, but hopefully someone with more experience can offer some insight into an issue I've been having: executing a large number of SQL queries efficiently.

Basically, my situation is this: I'm running WordPress and WooCommerce on my local Homestead 2 environment. I have a CSV with roughly 10,500 rows; each row is just a SKU and a stock number to be updated. The bit I put together to parse the CSV is quick on its own: when I have it just echo out each SKU, it finishes in half a second to a second.
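For context, the parsing step is essentially this (a simplified sketch; the exact column positions and variable names in my script differ, and 0 = SKU / 1 = stock is an assumption here):

```php
<?php
// Read sku,stock pairs from a CSV into an array keyed by SKU.
// Assumes column 0 is the SKU and column 1 is the stock count.
function read_stock_csv(string $path): array {
    $rows = [];
    if (($handle = fopen($path, 'r')) !== false) {
        while (($line = fgetcsv($handle)) !== false) {
            if (count($line) < 2) {
                continue; // skip malformed rows
            }
            $rows[trim($line[0])] = (int) trim($line[1]);
        }
        fclose($handle);
    }
    return $rows; // sku => stock
}
```

This part runs in well under a second even for all ~10,500 rows; it's only when I start hitting the database that things fall over.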

The issue comes when I try to have it update the database for each record: it times out, with nothing written to any error log I can find. The same thing happens when I just try retrieving each product ID by its SKU. I imagine this is because it's trying to perform thousands of queries in a single request.

Before I started trying to improve its performance, the script worked but took forever (20 to 30 minutes) to run. And when I tried to make it write a summary to a file when finished (for example, how many SKUs had their stock updated), it never seemed to get that far.
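One thing I've been considering is collapsing each chunk of rows into a single UPDATE instead of one query per row. Something like this sketch, which assumes stock lives in WooCommerce's `_stock` postmeta row keyed by product ID (table prefix and the `_stock` key are assumptions about my setup; IDs and stock values are cast to int, so this sketch skips further escaping):

```php
<?php
// Build one batched UPDATE for a chunk of post_id => stock pairs, using
// CASE ... WHEN so a 500-row chunk becomes a single query.
function build_stock_update_sql(array $stock_by_post_id, string $prefix = 'wp_'): string {
    $cases = [];
    $ids   = [];
    foreach ($stock_by_post_id as $post_id => $stock) {
        $post_id = (int) $post_id;
        $stock   = (int) $stock;
        $cases[] = "WHEN {$post_id} THEN '{$stock}'";
        $ids[]   = $post_id;
    }
    return "UPDATE {$prefix}postmeta SET meta_value = CASE post_id "
         . implode(' ', $cases)
         . " END WHERE meta_key = '_stock' AND post_id IN (" . implode(',', $ids) . ")";
}
```

The idea would be to `array_chunk()` the parsed rows into batches of a few hundred and run each batch through `$wpdb->query()`, but I haven't verified this is actually faster in my setup.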

I've tried using $wpdb, PDO, and WP_Query with no success.

I have similar issues when using the REST API to get all of the products out of the database (a third party wants them): cURL would time out, and while increasing CURLOPT_TIMEOUT gets it working, I don't think that's a sustainable fix. The time it takes to fetch all of the products will only grow as more products are added, and I'd have to keep going in and raising the limit. 20 products works just fine; all 300 products and their variations, not so much.
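I'm wondering if paging through the products with the standard `per_page`/`page` query parameters would be better than one giant request. A sketch of what I mean (the endpoint URL and auth details are placeholders for my setup; the point is the paging loop):

```php
<?php
// Build the URL for one page of the products endpoint.
function products_page_url(string $base, int $page, int $per_page = 50): string {
    return $base . '?' . http_build_query(['per_page' => $per_page, 'page' => $page]);
}

// Usage sketch (network calls and authentication omitted):
// $page = 1;
// do {
//     $url      = products_page_url('https://example.test/wp-json/wc/v3/products', $page);
//     $products = json_decode(file_get_contents($url), true); // or cURL
//     $page++;
// } while (!empty($products));
```

That way each request stays small no matter how many products the store grows to, instead of the timeout scaling with the catalog size.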

Anyway, any help or insight would be greatly appreciated :) Thanks!
