
Worker B tries to handle a job already being processed by worker A

I have a main server that dispatches long-running jobs (8+ minutes per job) to an SQS queue. I have two separate worker servers listening to the same SQS queue; let's call them worker A and worker B.

The issue is that while worker A is processing a job, if I start worker B, worker B tries to process the job that A is already working on. The result is that worker B reports a job failure ("job running too long" or "max attempts reached"). It's as if the job isn't locked in the queue when worker A picks it up, so other workers can pick it up too.

I don't actually start one worker and then the other. What actually happens is that when worker B finishes a job of its own, it then tries to process the job that A is already processing.

I cannot find any way in the documentation to lock a job that is already being processed. The only solution I can see is having my main server dispatch jobs to two different queues depending on the load of each worker server.

Am I missing something basic to make this work?
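If the workers consume directly from SQS, the "lock" being described here is SQS's visibility timeout: once a worker receives a message, SQS hides it from other consumers for that many seconds. If the timeout is shorter than the 8+ minute job duration, the message becomes visible again mid-run and worker B will legitimately receive it. A minimal sketch of raising the timeout with the AWS CLI, assuming a queue like the one described (the queue URL below is a placeholder, not from the thread):

```shell
# Raise the queue's default visibility timeout above the longest job
# duration (e.g. 1200 s comfortably covers jobs that run 8-10 minutes).
# The queue URL is a placeholder -- substitute your own.
aws sqs set-queue-attributes \
  --queue-url "https://sqs.us-east-1.amazonaws.com/123456789012/my-jobs-queue" \
  --attributes VisibilityTimeout=1200
```

If the workers are Laravel `queue:work` processes, note that the SQS driver relies on this queue-side visibility timeout; the `retry_after` option used by other queue drivers does not apply to SQS.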


I actually have 1200 seconds set. It's as if worker B doesn't know that the job being processed is reserved.
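It may be worth checking where those 1200 seconds are actually set: if they are configured only on the worker side (for example as a worker timeout), the SQS queue itself may still be using its default 30-second visibility timeout, which would produce exactly this behavior. A quick way to see what the queue reports (the queue URL is a placeholder):

```shell
# Print the queue's current visibility timeout; the URL is a placeholder.
aws sqs get-queue-attributes \
  --queue-url "https://sqs.us-east-1.amazonaws.com/123456789012/my-jobs-queue" \
  --attribute-names VisibilityTimeout
```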
