AlsonicTech's avatar

Queue solution for shared hosting

We all know that when your website is on shared hosting, your permissions are limited: you don't have root console access to install Beanstalkd, Supervisor, or other queue solutions.

What solution do you use for shared hosting, when it comes to queuing?

0 likes
34 replies
jekinney's avatar

Yes, set the queue driver to database and run the migrations from the docs. Then set up a cron job to check/restart the queue listener.

jekinney's avatar

@Kennyendowed There isn't any code to show. It's configuration, and the Laravel docs go step by step through setting up a database queue. Off the top of my head: you run an artisan command to generate the migrations that create the queue tables (one for jobs and one for failed jobs), run the migration, set the env variable to database, and done.
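A minimal sketch of those steps (Laravel 5.x with the database driver; run from the project root):

```shell
# Generate the migrations for the queue tables
php artisan queue:table         # jobs table
php artisan queue:failed-table  # failed_jobs table

# Create the tables
php artisan migrate

# Then point the queue at the database driver in .env:
# QUEUE_DRIVER=database
```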

nfauchelle's avatar

Good to know a database driver was added in 5.0+! I had to use a package for 4.2. A database queue is useful for smaller projects that just have the occasional job that needs to be run in the background.

jekinney's avatar

Also great for local dev and troubleshooting, instead of setting up a more sophisticated approach right away. Since the sync driver runs jobs synchronously and the database driver runs them asynchronously, it helps keep browser testing fast.

ankurgoel's avatar

I need to process files uploaded by users. If I process each file at upload time, then when many users upload at once the server will be heavily loaded. So I am thinking of using queues with the database driver, but I am not sure how to schedule the cron job, because the job needs to run when the user uploads the file.

If the processing job is busy, the file will be added to the queue and processed later. Please help me, as I need to host on shared hosting where I do not have SSH access.

I need to know how to schedule the cron job, if cron is the only option, or whether there is another solution.
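One point worth clarifying for the upload case: you don't schedule the job itself with cron; you dispatch it at upload time, and cron only keeps the worker running. A hypothetical controller sketch (the class and job names are assumptions, not from this thread):

```php
<?php

use Illuminate\Http\Request;

class UploadController extends Controller
{
    public function store(Request $request)
    {
        // Storing the raw file is cheap, so do it inline.
        $path = $request->file('upload')->store('uploads');

        // Push the heavy processing onto the database queue.
        // ProcessUploadedFile is a hypothetical queued job class.
        $this->dispatch(new ProcessUploadedFile($path));

        return back()->with('status', 'File queued for processing.');
    }
}
```

The cron-driven worker then picks jobs off the jobs table one at a time, so simultaneous uploads queue up instead of overloading the server.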

andfelzapata's avatar

I need help with this too please !!

Right now I have this in my Console/Kernel.php class:


// Process pending jobs.
$schedule->command('queue:restart')
    ->everyFiveMinutes();

$schedule->command('queue:work --daemon')
    ->everyTenMinutes()
    ->sendOutputTo(storage_path('logs/queue-jobs.log'));

But it seems it doesn't always work. Is there no way to install Supervisor? I have HostGator shared hosting.

daem0n's avatar

What I have done in a similar situation (I have shared cPanel hosting and no way to install Supervisor): I used cron to run Laravel's scheduler every minute, and inside it I used this:

$schedule->command('queue:work --daemon')->everyMinute()->withoutOverlapping();

So what's happening is: the first time it runs, it starts the queue worker in daemon mode; then every minute, withoutOverlapping means it only runs again if the previous one crashed, exited, or is otherwise no longer running. This essentially produces Supervisor-like functionality. In the worst case it takes a minute for the queue worker to come back up after a failure or memory-limit hit, but in most cases the first process stays alive and works through the queue. This is a better way to get queue:listen-like functionality without Supervisor.

The caveat, of course, is that you need to run queue:restart whenever you deploy new code, so that the daemon worker restarts and picks up the fresh app code.

Hope this helps.
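For reference, the cron side of this setup is a single entry that runs the Laravel scheduler every minute (paths are placeholders; adjust them for your host):

```shell
# crontab entry: run the Laravel scheduler every minute
* * * * * /usr/bin/php /path/to/project/artisan schedule:run >> /dev/null 2>&1
```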

28 likes
usman350's avatar

@daem0n Thanks, it saved my day and cleared up my understanding of queue workers on shared hosting.

AshAsley's avatar

Hi, can anyone explain how to handle php artisan queue:listen on Bluehost with database as the default driver? I tried the same thing in local development with php artisan queue:listen and it worked fine, but when I run the same command in a cron job it doesn't work. Someone please help.

1 like
spekkionu's avatar

To run artisan commands in a cron you will need to provide the full path to the artisan script and on some hosts the full path to the php executable as well.

/usr/bin/php /path/to/project/artisan command
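If you're unsure where the PHP binary lives, a couple of host-dependent ways to find it (the cPanel paths shown are examples, not guaranteed):

```shell
# Print the full path of the PHP CLI on this host
command -v php

# Some cPanel hosts install several versions side by side, e.g.
# /usr/local/bin/php or /opt/cpanel/ea-php74/root/usr/bin/php
ls /usr/local/bin/php* 2>/dev/null
```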
2 likes
AshAsley's avatar

@spekkionu How can we find the full path to the PHP executable? Also, can you help me with this: how can we queue email in Laravel 5.2 on a live server?

In local development it works absolutely fine, but when I upload to cPanel it doesn't work at all.

I have done the following on my local machine, where it works fine:

Queue configuration:

+ Go to the .env file and set the queue driver to database.

+ From the command line, generate the queue tables with "php artisan queue:table", also run "php artisan queue:failed-table", and finally run "php artisan migrate".

+ The database tables will be created and migrated.

+ Now, to queue an email to be sent after a few seconds, I use this:

Mail::later(5, 'frontend.general.checkout.email.orderby', array('maildata' => $maildata), function ($message) use ($maildata) {
    $message->to($maildata['email'], 'Ali')->subject($maildata['subjectorderby']);
});

+ To listen for queued jobs, run "php artisan queue:listen".

---------------------------This all works fine on my local machine------------------------

Now on cPanel I have set up the same thing, but it didn't work. I also set up a cron job with the same command, php artisan queue:listen, but no email is received after 5 seconds.

I have spent almost a week struggling with this and couldn't find an appropriate answer.

AshAsley's avatar

Spekkionu I entered the following command : /usr/bin/php /home1/buydsell/public_html/artisan queue:listen

I got the following response from Bluehost by email: Status: 500 Internal Server Error Content-type: text/html

pilat's avatar

@DIEGOPONCIANO - Turns out that flock is only good for preventing the same command from running more than once. What it doesn't do:

  1. it won't notice if a listener has hung (i.e., it's still in ps axf, but the queue is not moving)
  2. it does not guarantee the lock file is removed after the process finishes; so next time it won't start…

I discovered this recently. It looks like something changed on the server and now they kill listeners quite often…
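For context, the flock approach being discussed looks roughly like this (paths are placeholders):

```shell
# crontab entry using util-linux flock: -n makes it exit immediately
# if another copy already holds the lock, instead of waiting
* * * * * flock -n /tmp/laravel-queue.lock -c "/usr/bin/php /path/to/project/artisan queue:work --daemon"
```

This only guards against double-starting; it cannot detect a worker that is still running but hung.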

pilat's avatar

@DAEM0N - I have a question: how does this play with other scheduled (not queued) tasks? Won't this command take up all the execution time, so the others never get a chance? Or will the others be processed on the next minute's run, thanks to withoutOverlapping()?

howtomakeaturn's avatar

My solution is:

$schedule->command('queue:restart')
    ->everyFiveMinutes();

$schedule->command('queue:work --daemon')
    ->everyMinute()
    ->withoutOverlapping();

Currently it is working. I'll give it a few days to see if there are any issues.

mrfash's avatar

Am I missing something? I copied this and pasted it into my kernel and it's still not working.

Snapey's avatar

@mrfash you still need a cronjob to run the scheduler every minute. Have you set that up?

yahoo30000's avatar

@howtomakeaturn somehow the queue:restart command disables the queue service.

I'm just using queue:work --daemon and everything's fine; the queue works for about 6 to 10 hours, then gets restarted, and then works normally.

Is it necessary to use queue:restart?

Snapey's avatar

@yahoo30000 queue:restart is recommended, but all it does is signal the worker to perform an orderly shutdown. It assumes you have Supervisor, which will notice the worker has stopped and restart it. I suggest you create a cron job to shut down and restart the queue worker every few hours.
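Following that suggestion, a hypothetical crontab pair (paths are placeholders): one entry keeps the scheduler, and hence the worker, running; the other signals an orderly shutdown every few hours so a fresh worker is started:

```shell
# every minute: scheduler restarts the worker if it died (withoutOverlapping)
* * * * * /usr/bin/php /path/to/project/artisan schedule:run >> /dev/null 2>&1

# every 6 hours: tell the running worker to finish its current job and exit
0 */6 * * * /usr/bin/php /path/to/project/artisan queue:restart >> /dev/null 2>&1
```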

Kingdomac's avatar

I am trying to deploy a Laravel app with Breeze, Inertia, and SSR to AWS Elastic Beanstalk, but it fails to run the php artisan inertia:start ssr command, as node and npm commands cannot be run on EB. Is there any solution?
