bhavesh-786's avatar

How to Handle Long-Running Scripts Without Affecting Other Users (Multi-threading or Parallel Processing in Laravel)

Hello everyone,

I’m working on a Laravel application, and I’ve noticed that when a large or heavy script runs (for example, fetching big data from the database), it causes the system to hang or become very slow for other users accessing the site at the same time.

Is there any way to create a multi-threaded environment or run such long processes in parallel so that they don’t block other requests?

Any suggestions or best practices (like queues, jobs, or async processing) would be really helpful.

Thank you in advance!

bhavesh-786's avatar

Thank you for your response. Let me check the links shared above.

Glukinho's avatar

If your heavy task overloads your server, queues won't help: the server will still be overloaded by the queued work, so there's no difference.

You should investigate whether your issue is locking (in that case, extracting long tasks to a queue helps) or insufficient server resources (in that case, you should add CPU/RAM/etc. to your server, or run the heavy tasks somewhere else).
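For reference, "extracting a long task to a queue" in Laravel means wrapping it in a queued job so it runs in a worker process instead of blocking the web request. A minimal sketch (the class name, `$reportId`, and the query are hypothetical placeholders, not from this thread):

```php
<?php
// Hypothetical job class: app/Jobs/GenerateHeavyReport.php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class GenerateHeavyReport implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public int $reportId)
    {
    }

    public function handle(): void
    {
        // The expensive query runs here, in a queue worker,
        // not in the web request that triggered it.
        $rows = DB::select('/* heavy report query goes here */');

        // ... write the results to a report table, file, etc.
    }
}

// In a controller: dispatch and return a response immediately.
// GenerateHeavyReport::dispatch($reportId);
//
// Process jobs in the background with:
//   php artisan queue:work
```

Note this only helps with request blocking; as pointed out above, the work itself still consumes the same server resources wherever it runs.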

bhavesh-786's avatar

Thank you for your response. I've checked this, and everything is in good condition.

bhavesh-786's avatar

Thank you for your response. I'll check the links you provided. I'm using stored procedures, and some of the queries are heavy, so they take time; because of that, other users are affected at the same time. I'm trying to find out whether we can use any kind of multithreading.

martinbean's avatar

@bhavesh-786 It might help if you gave us a little more context.

What are these long-running scripts doing? Why are they fetching “big data” from the database? What are they doing with this “big data”?

bhavesh-786's avatar

We are working with reports that have lots of joins, and the data is also heavy, so fetching takes time. Because of that, other users experience slowness at the same time.

jlrdw's avatar

Why not have a second database that you write to and use just for reports?

Also, why so long? Run monthly and combine for quarterly. Do summary reports if detail isn't needed. Archive older data in an archive database, etc.

Learn about partitioning as well.

As an example, at the logistics company I worked at, we would archive the previous year and start a new fiscal year's data.

Study topics on database management.

Strategically design your reports for what's actually needed.
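In Laravel, a second, reports-only database is just an extra connection in `config/database.php` that you query explicitly. A sketch under the assumption of a MySQL-style setup (the `reports` connection name and env variables are illustrative):

```php
<?php
// config/database.php (fragment) -- hypothetical 'reports' connection

return [
    'connections' => [
        'mysql' => [
            // primary connection the application normally uses
        ],

        'reports' => [
            'driver'   => 'mysql',
            'host'     => env('REPORTS_DB_HOST', '127.0.0.1'),
            'database' => env('REPORTS_DB_DATABASE', 'reports'),
            'username' => env('REPORTS_DB_USERNAME', 'reports'),
            'password' => env('REPORTS_DB_PASSWORD', ''),
        ],
    ],
];

// Usage: point heavy report queries at the dedicated connection so they
// don't compete with the primary database.
// DB::connection('reports')->select('/* report query */');
```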

bhavesh-786's avatar

Ok I'll check on this topic thank you for your response.

martinbean's avatar

We are working with reports that have lots of joins, and the data is also heavy, so fetching takes time. Because of that, other users experience slowness at the same time.

@bhavesh-786 So you have two options:

  1. Set up a read replica of your database, and do your reporting on that connection so as not to slow down queries on your primary database.
  2. Write your data at time of creation to a far more appropriate solution (e.g. a proper data warehouse) that's then used for reporting and analytics needs.
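For option 1, Laravel supports a read/write split directly in the connection config: SELECTs go to the `read` hosts (the replica) and writes go to the `write` host. A sketch with hypothetical hostnames:

```php
<?php
// config/database.php (fragment) -- read/write split, hosts are illustrative

return [
    'connections' => [
        'mysql' => [
            'driver' => 'mysql',
            'read' => [
                // replica used for SELECT queries
                'host' => [env('DB_READ_HOST', 'replica.example.internal')],
            ],
            'write' => [
                // primary used for INSERT/UPDATE/DELETE
                'host' => [env('DB_WRITE_HOST', 'primary.example.internal')],
            ],
            'database' => env('DB_DATABASE', 'app'),
            'username' => env('DB_USERNAME', 'app'),
            'password' => env('DB_PASSWORD', ''),
        ],
    ],
];
```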

bhavesh-786's avatar

Dear all, we're using Oracle Jobs to run a heavy query and populate a temporary/report table. Our issue is that when one request, such as a screen or a small report, takes 10 to 20 seconds to fetch data, unrelated requests from other users to the Laravel server are blocked and must wait until the previous process completes. Why is the Laravel server waiting for unrelated processes to finish? Are all server requests being processed sequentially, in a queue?