
patressz

Laravel multiple workers

Hello guys, I have a Laravel application deployed on DigitalOcean, and my app runs some queued jobs. I already have this scheduler configuration:

    /**
     * Define the application's command schedule.
     */
    protected function schedule(Schedule $schedule): void
    {
        $schedule->command('queue:work --queue=process_photo')
            ->withoutOverlapping()
            ->everyMinute();
    }

I have also set up a cron entry that executes the artisan command `schedule:run` every minute.
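For reference, the cron entry follows the pattern from the Laravel docs (the `/var/www/myapp` path here is just a placeholder, not my real path):

    * * * * * cd /var/www/myapp && php artisan schedule:run >> /dev/null 2>&1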

But I have a problem: it's too slow for my needs. I'm considering solving this by setting up multiple containers on DigitalOcean, for example 4 containers, each doing that job. However, I'm not sure how to prevent the same job from being executed twice, i.e. on two servers at the same time.

I know there is an `onOneServer()` option, but I'm not sure how it works. I'd be glad for any suggestions, thank you :)
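From what I can tell from the docs, `onOneServer()` is chained onto the scheduled command and uses a cache lock (so it needs a cache driver that supports atomic locks, e.g. `redis`, `memcached`, or `database`) to make sure only the first server to grab the lock runs the task. A sketch of how it would look in my schedule, if I kept this approach:

    protected function schedule(Schedule $schedule): void
    {
        // Only the server that acquires the cache lock runs this each minute.
        $schedule->command('queue:work --queue=process_photo')
            ->withoutOverlapping()
            ->onOneServer()
            ->everyMinute();
    }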

patressz

@martinbean Thank you for your answer. In my case, though, I already have the application deployed on multiple servers, currently two. Each server has 5 workers configured for the same queue. Is there a chance that two workers pick up the same job at the same time and process it twice?

martinbean

@patressz No. The entire point of queues is that jobs are processed once and exactly once.

Either way, you should not be running your queue workers via the scheduler. You should be running queue workers as daemon processes on your server(s), like the docs tell you to.
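The usual way to do that is with a process manager like Supervisor. A minimal config sketch along the lines of the Laravel docs (the paths and `numprocs` value here are placeholders you'd adjust for your own setup):

    [program:laravel-worker]
    process_name=%(program_name)s_%(process_num)02d
    command=php /var/www/myapp/artisan queue:work --queue=process_photo --sleep=3 --tries=3
    autostart=true
    autorestart=true
    numprocs=4
    redirect_stderr=true
    stdout_logfile=/var/www/myapp/storage/logs/worker.log

Supervisor then keeps the workers running continuously and restarts them if they die, which is what `queue:work` expects, rather than being re-launched every minute by cron.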

Snapey

You have cited my solution for running a single queue worker on a shared host where Supervisor is not available.

What makes you think you have 5 queue workers running?
