nil-to-null's avatar

Long Running Tasks on Vapor

Has anyone come up with a clever solution for long-running tasks in a codebase on Vapor? The maximum execution time of 15 minutes will not be long enough for some of the daily aggregation tasks that my company runs overnight.

I was wondering if anyone had come up with something clever?

My current idea is to deploy a "worker" machine to an EC2 instance using Forge and have it run the longer tasks. I don't really like this idea, as I would then have to manage multiple deployment tools for the same codebase.

Anyone else have any good ideas?

0 likes
7 replies
Snapey's avatar

Split the aggregation tasks into smaller queued jobs running in a batch?
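For reference, the batching approach Snapey suggests could look roughly like this with Laravel's `Bus::batch`. This is a sketch, not production code: `AggregateChunk` and the `aggregations` queue name are hypothetical, and the chunk size would depend on how the data splits up.

```php
<?php

use Illuminate\Support\Facades\Bus;
use App\Jobs\AggregateChunk; // hypothetical job class, one per chunk of work

// Split the nightly aggregation into 24 per-hour chunks so each job
// finishes well inside Vapor's per-job execution limit.
$start = now()->subDay()->startOfDay();

$jobs = collect(range(0, 23))
    ->map(fn ($hour) => new AggregateChunk($start->copy()->addHours($hour)))
    ->all();

Bus::batch($jobs)
    ->then(fn ($batch) => logger('Nightly aggregation complete'))
    ->onQueue('aggregations')
    ->dispatch();
```

The key constraint is that each individual job must stay under the timeout; batching only helps if the work can actually be partitioned.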

nil-to-null's avatar

The problem is that the aggregation has to download files that sometimes take 30-40 minutes to pull from external services. I have no control over how long these downloads take... I was looking at AWS Fargate, but I don't know if it's possible to grab the containerized version of the Vapor build to be used for deploying to AWS Fargate.

nil-to-null's avatar

@navneet I have not come up with an elegant solution. The solution I created requires me to also deploy to an EC2 instance for longer-running tasks.

This is a problem that I am going to have to solve again shortly, for other reasons, so I will post back here if I come up with something better.

navneet's avatar

@nil-to-null thanks for the reply. I am also facing the same issue; my export task requires the background job to run for 4-5 hours to export 10 million rows.

You are using Vapor with an EC2 instance for longer tasks? How does that work?

1 like
J5Dev's avatar

@navneet Just found this thread looking for a neater solution, and as yet haven't found a 'clean' one, but I can add/confirm that this seems to be the best approach currently.

Essentially we have developed a 'side' application which runs on a more traditional EC2 instance, and have extracted any long running tasks out of the Vapor app and into the side application, then simply have them both able to access the DB, file storage etc.

Obviously it gets a tad complicated around minimising duplication, so you would want to package up any shared logic and have both apps pull it in, as a way of minimising maintenance and updates.
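For anyone wondering how the shared-logic part might work in practice: one common way is to extract the shared code into a Composer package and require it from both apps. A sketch of what each app's `composer.json` could contain, assuming a hypothetical private package `your-org/shared-aggregations` hosted in a VCS repository:

```json
{
    "repositories": [
        { "type": "vcs", "url": "https://github.com/your-org/shared-aggregations" }
    ],
    "require": {
        "your-org/shared-aggregations": "^1.0"
    }
}
```

Both the Vapor app and the EC2 side application then pull the same versioned package, so aggregation logic only lives in one place.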

Think of it as starting the journey into micro services... kind of a dipping your pinky in approach :)

jseitel's avatar

Maybe S3 events could be helpful in this case? Create a dead-simple worker node (it can be anything, it doesn't have to be Laravel) that can be called from your Laravel queue or just run on a cron. All it does is transfer files from the source to an S3 bucket. Configure S3 to fire an event on completion and call an endpoint in your Laravel app to handle any kind of post-processing.

Did something similar recently for transcribing call data and it worked like a charm.

More here: https://docs.aws.amazon.com/AmazonS3/latest/userguide/EventNotifications.html
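One caveat worth noting on this approach: S3 event notifications can't call an HTTP endpoint directly; they publish to SNS, SQS, or Lambda, and an SNS HTTPS subscription (or a small Lambda) can then hit the Laravel endpoint. A sketch of the bucket's notification configuration, with a hypothetical topic ARN and key prefix:

```json
{
    "TopicConfigurations": [
        {
            "Id": "incoming-file-complete",
            "TopicArn": "arn:aws:sns:us-east-1:123456789012:incoming-files",
            "Events": ["s3:ObjectCreated:*"],
            "Filter": {
                "Key": {
                    "FilterRules": [
                        { "Name": "prefix", "Value": "incoming/" }
                    ]
                }
            }
        }
    ]
}
```

The prefix filter keeps the topic from firing on unrelated uploads to the same bucket.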
