What do you have as the timeout value? It could be that the other worker thinks the job must have failed because the execution timeout has passed.
This may also give some hints:
https://divinglaravel.com/explaining-laravel-queue-configuration-keys
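For reference, here is a minimal sketch of the two knobs involved, assuming a stock Laravel SQS setup (the connection values and the 600-second figures below are illustrative). The key point is that for the SQS driver Laravel's `retry_after` setting is ignored: re-delivery is governed by the queue's Visibility Timeout configured on the AWS side, so if that window is shorter than an 8-minute job, SQS will hand the same message to the next polling worker.

```php
<?php
// config/queue.php — SQS connection (values are illustrative).
// Note: 'retry_after' has no effect for SQS. When an in-flight
// message becomes visible again is decided by the queue's
// Visibility Timeout in AWS, so that value must exceed the
// longest job's runtime.
'connections' => [
    'sqs' => [
        'driver' => 'sqs',
        'key'    => env('AWS_ACCESS_KEY_ID'),
        'secret' => env('AWS_SECRET_ACCESS_KEY'),
        'prefix' => env('SQS_PREFIX', 'https://sqs.us-east-1.amazonaws.com/your-account-id'),
        'queue'  => env('SQS_QUEUE', 'default'),
        'region' => env('AWS_DEFAULT_REGION', 'us-east-1'),
    ],
],

// Worker side: --timeout must also exceed the longest job, e.g.
//   php artisan queue:work sqs --timeout=600
```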
I have a main server that dispatches long-running jobs (8+ minutes per job) to an SQS queue. I have two separate worker servers listening to the same SQS queue; let's call them worker A and worker B.
The issue I have is that when worker A is processing a job and I start worker B, worker B tries to process the job that A is already processing. The result is that worker B reports a job failure (job running too long, or max attempts reached). It's as if when worker A picks up a job, the job is not locked in the queue and can still be picked up by other workers.
I don't actually run one worker and then start another. What actually happens is that when worker B finishes a job, it tries to process the job that A is already working on.
I cannot find any way in the documentation to lock a job that is already being processed. The only solution I can see is to have my main server dispatch jobs to two different queues depending on the load of my worker servers.
Am I missing something basic to make this work?
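In case it helps, here is a hedged sketch of the job-side settings worth checking first (the class name `LongRunningJob` and the 600-second timeout are hypothetical, not from the original post): give the job an explicit timeout longer than its real runtime, cap the attempts, and, since the two workers run on separate servers, optionally add Laravel's `WithoutOverlapping` middleware backed by a cache store both servers share (e.g. Redis).

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\Middleware\WithoutOverlapping;
use Illuminate\Queue\SerializesModels;

// Hypothetical long-running job (~8 min of work).
class LongRunningJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    // Allow more than the real ~8 min runtime before the
    // worker kills the job.
    public $timeout = 600;

    // Fail immediately instead of retrying: a retry here usually
    // means another worker received the same SQS message.
    public $tries = 1;

    // Optional safety net: an atomic lock so the same logical job
    // never runs on two workers at once. By default an overlapping
    // job is released back onto the queue rather than failed.
    // Requires a cache store (e.g. Redis) reachable from both
    // worker servers.
    public function middleware(): array
    {
        return [new WithoutOverlapping(static::class)];
    }

    public function handle(): void
    {
        // ... the long-running work goes here ...
    }
}
```

Even with this in place, the decisive setting for SQS is still the queue's Visibility Timeout in the AWS console, since SQS, not Laravel, decides when an in-flight message becomes visible to other workers again.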