HartleySan's avatar

Is it possible to run code / DB queries in parallel in Laravel?

As the topic question says, is it possible to run code / DB queries in parallel in Laravel?

I have a use case where I need to run several DB queries called from Laravel model methods to load a page, and the queries are not reliant on each other at all. As such, it'd be great if I could run the model method calls in parallel and speed up the loading of the page.

Is this possible to do in Laravel, and if so, how? Thank you.

HartleySan's avatar

Thank you very much for the quick reply. So I guess it's a matter of creating a job to do the work I want done in parallel, and then dispatching a job for each of the processes to be run in parallel? Thanks.

Snapey's avatar

No, it's impractical. PHP is single-threaded.

Queued jobs with the right setup can process jobs in parallel, but you have no practical method of retrieving the queried data.

chaudigv's avatar

queries are not reliant on each other at all

If that's the case, then I suggest breaking your code down into multiple components and calling those components together.

Snapey's avatar

And don't overlook the overhead of booting the framework, checking the session, routing the request, etc.

sr57's avatar

That's probably a second-order effect in this case.

HartleySan's avatar

@chaudigv, what do you mean by multiple components? What is a component in Laravel?

Anyway, I am honestly a bit confused about what to do now, but I'm going to first attempt @sr57's recommendation and see where that gets me. Thanks.

HartleySan's avatar

To add on to my previous post, I just finished reading through the entire Laravel article on jobs and queues (https://laravel.com/docs/8.x/queues), and I'm starting to wonder if they're the best solution.

In order to do a bunch of DB queries in model methods in parallel, I would have to:

  1. Create a job to process each one (hopefully I could create just one job that could variably handle all of them).
  2. (I think) create a bus/batch to process all of the jobs in a batch (i.e., in parallel).
  3. Once I've dispatched all the jobs, because there's no way (I can immediately see) to return the data processed by the job, I'd have to, for example, store the processed results temporarily in a DB/Redis, etc. and then grab the data once all jobs are done, which seems to incur a lot of extra overhead.
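For reference, the batching flow described above might look roughly like this. This is only a sketch: `RunReportQuery` is a hypothetical job class that runs one query and writes its result to the cache, and the polling loop at the end is exactly the extra overhead mentioned in step 3 (Laravel 8's `Bus::batch` API is assumed):

```php
use App\Jobs\RunReportQuery; // hypothetical job: runs one query, caches the result
use Illuminate\Support\Facades\Bus;
use Illuminate\Support\Facades\Cache;

// Dispatch one job per independent query; with multiple queue workers
// running, these can execute in parallel.
$batch = Bus::batch([
    new RunReportQuery('sales'),     // each job would write to e.g. "report:sales"
    new RunReportQuery('inventory'),
    new RunReportQuery('users'),
])->dispatch();

// Because batched jobs can't return values to the request, the request
// cycle has to poll until the batch finishes, then read the cache back:
while (!$batch->finished()) {
    sleep(1);
    $batch = Bus::findBatch($batch->id);
}

$sales = Cache::get('report:sales');
```

The polling and the temporary cache storage are why this route adds so much machinery compared to just running the queries inline.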

Long story short, it doesn't seem ideal.

As it stands, I'm using Promise.all() in JS after the page loads to process a bunch of requests in parallel. Time-wise, it works great, but I never liked the effect of part of a page loading on page-load, and then a bunch of other parts of the page popping up after the fact. Basically, I was hoping to process everything (in parallel) on the back-end to both get a fast-loading page and everything on the page on load.

trin's avatar

Parallel DB queries in PHP? Hmm, that sounds like "I need more phantom bugs." :) Really, you don't need to parallelize the DB queries in your model. If you need to speed up your app, upgrade your server, or use a cache.

HartleySan's avatar

For what it's worth, my app is already performant, and it's not really an issue, but I'm trying to make it even faster / think ahead a bit.

All the same, after doing some more searching, I found this article and realized that you can do Promise.all-like logic with Guzzle in Laravel:

https://medium.com/@ardanirohman/how-to-handle-async-request-concurrency-with-promise-in-guzzle-6-cac10d76220e

I'm not sure I want to go this route, but I can easily use Guzzle to send parallel API requests to my own API and get the data that way. It does work, but again, not sure that's the route I want to go.
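The Guzzle approach from that article boils down to something like the sketch below. Note the assumptions: the base URI and endpoint paths are placeholders for your own API routes, and `Utils::settle` is the newer spelling of what the Guzzle 6 article calls the `\GuzzleHttp\Promise\settle()` function:

```php
use GuzzleHttp\Client;
use GuzzleHttp\Promise\Utils;

$client = new Client(['base_uri' => 'https://example.test/api/']);

// Fire all requests at once; Guzzle sends them concurrently over cURL.
$promises = [
    'orders'   => $client->getAsync('orders'),
    'products' => $client->getAsync('products'),
    'stats'    => $client->getAsync('stats'),
];

// Wait for every request to settle (like Promise.allSettled in JS).
$results = Utils::settle($promises)->wait();

$data = [];
foreach ($results as $key => $result) {
    if ($result['state'] === 'fulfilled') {
        $data[$key] = json_decode($result['value']->getBody(), true);
    }
}
```

Each request still runs in its own PHP process on the server, so this gets real parallelism while keeping everything in the back end.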

trin's avatar

You don't need it, really. From a Promise.all-style approach to queries, you'd get a little performance optimization, and a lot of bugs and labyrinthine logic.

  1. Use eAccelerator/APC to optimize PHP execution time.
  2. Use a DB cache to optimize DB time.
  3. Use EXPLAIN/profiling to optimize heavy DB queries.

HartleySan's avatar

I greatly appreciate your advice, trin, and truth be told, I pretty much agree with you. It's just that in this particular case, the users themselves (admins, at least) are the ones writing the queries, and there are times when the queries they write are not well optimized.

As such, running them in parallel helps. Also, to make things as flexible as possible, I provided the option for users to choose whether to load everything on the back end pre-page-load or to use Promise.all in JS to load things after the fact.

trin's avatar

Interesting case. :) If I had a similar problem, I would solve it at the queue level. But if the queries are simple and need to run at the Laravel level, then as I said before, I wouldn't try to solve this problem.

HartleySan's avatar

trin, when you say "solve it at the queue level," what do you mean? How specifically would you solve this problem using queues? I still don't get that. Thanks.

Snapey's avatar

Queued jobs cannot really solve your problem because once you dispatch the job to the queue, you have no idea when it will actually run (depends on the queue load) and you cannot practically get the results back into the request cycle.

If you break the work into multiple xhr requests from the client then each request will be run in parallel in a separate PHP process and return results back to the client.

This would be the best solution in your example.

HartleySan's avatar

That's exactly what I did originally using Promise.all. And because it works well, I decided to keep it, but I also wanted the back-end-only solution, as the former has some unintended side effects.

For example, some of the pages display a lot of data. If a user scrolls down to some data, follows a link to another page, and then decides to go back, then with the front-end approach, because there is technically no data on the page when it first loads, they end up at the top of the page again, which is frustrating for some users. By having a back-end solution as well, the browser will always take them back to the exact spot on the page they were at.

As I mentioned to trin, I ultimately am giving the users the ability to choose which route they want to go, and if it turns out that the back-end solution is too taxing on the system, then I'll force the front-end-only solution on users and they'll just have to live with it.

So far as I can offer both solutions though, I will. Thanks for your help and advice.

Snapey's avatar

consider caching the results of the query
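In Laravel that can be as simple as wrapping the query in `Cache::remember`. A minimal sketch, where the cache key and the 10-minute TTL are placeholders, and the literal query stands in for the admin-written SQL from earlier in the thread:

```php
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\DB;

// Cache the result for 600 seconds; repeat page loads skip the
// database entirely until the entry expires.
$results = Cache::remember('dashboard:sales', 600, function () {
    // This literal stands in for the admin-supplied query.
    return DB::select('select id, total from orders');
});
```

For slow, user-written queries that don't change between page loads, this tends to buy far more than parallelizing them would.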

sr57's avatar

@hartleysan

To your original question, my first answer (sync jobs) is correct.

To your need:

Basically, I was hoping to process everything (in parallel) on the back-end to both get a fast-loading page and everything on the page on load.

I wonder if it's useful...

Let's imagine you have two complex/heavy SQL queries and are not able to optimize them (EXPLAIN, etc.). Time will be spent mainly in disk I/O and CPU calculation (system and 'web' times are second-order).

Theoretically, launching the two SQL queries in parallel should be useful: the OS will have more room to optimize calculation and disk I/O across the two queries running in parallel than across the two run in a row.

Practically, it depends on the memory needed, which tends to be significant for heavy SQL, and running in parallel will improve things a little, not at all, or may even make them worse.

And furthermore, this supposes only one user! On a classic system, i.e. a multi-user system, the OS will optimize everything anyway, and you'll gain nothing in total time by running the jobs in parallel.

Snapey's avatar

To your original question, my first answer (sync jobs) is correct.

with respect, you are seriously mistaken

Sync jobs are not queued at all; they are executed inline, and the rest of the code waits for them to finish. It's just the same as having the code directly in the request path (e.g., the controller).

sr57's avatar

Thanks @snapey

@hartleysan The beginning of my previous post should have read: "To your original question, my first answer (sync jobs) is totally wrong."

The rest of my answer should be correct... up until @snapey's correction. :-)

In fact, being new to Laravel, I've only used one sync job, and @snapey has made me understand a problem in my code.

That said, Laravel is PHP, and you can still use exec() with a command ending in & (on Linux) to run jobs in parallel.
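A minimal sketch of that exec() trick, assuming a hypothetical `report:run` artisan command that writes its result somewhere the app can pick it up later. Note that, as with queued jobs, the request has no direct way to get the result back:

```php
// The trailing "&" (plus redirecting output to /dev/null) makes the
// shell return control to PHP immediately, so the command keeps
// running in the background on Linux.
exec('php artisan report:run sales > /dev/null 2>&1 &');
```

This sidesteps the queue entirely, but you lose retries, monitoring, and any guarantee about when (or whether) the background process finishes.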

HartleySan's avatar

I appreciate all the advice and help, guys, but using Promise.all in JS for front-end processing and Guzzle promises on the back end seems to be the best bet thus far. Thanks.
