Console Command called from the web dies early

Published 2 years ago by TonsOfLaz

Hi, I have a console command that takes about 20 minutes to run. It needs to be kicked off by an admin from the web, so I call it like so:

Route::post('send-email', function () {
    $exitCode = Artisan::call('myapp:send');
});

In my Console Command, I have added these lines to the top:

ignore_user_abort(true);
set_time_limit(0);
ini_set('max_execution_time', 5000);
ini_set('memory_limit', '100M');

class SendMyAppEmail extends Command
{
    // ...
}

But it never gets beyond about a minute or so. If I run it directly from the console, it completes with no problems.

Does anyone know why this process would crash when kicked off from the web interface but not the console? Is there a good way in Laravel to run this command in the background the same as if it was run from the console?

Thank you very much for your time.

HenryDinh

You can use task scheduling to do it; read about it at https://laravel.com/docs/5.3/scheduling
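For anyone following along, a scheduled entry for this command would go in the schedule() method of app/Console/Kernel.php; a minimal sketch (the time here is just an example) might look like:

protected function schedule(Schedule $schedule)
{
    // Run the send command automatically at a fixed time.
    $schedule->command('myapp:send')->dailyAt('08:00');
}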

TonsOfLaz

Hi @HenryDinh, thanks for the response. Unfortunately it needs to be started by an admin when it is ready to send, not on a specific schedule. While there may be a roundabout way to use the scheduler, I am hoping there is a way to run this command as needed and have it run to completion.

nate.a.johnson

I don't know why you'd want to risk it timing out or why you'd want a browser to hang for 30 minutes. You're begging for all kinds of trouble. Just let the admin click a button that puts a job in the queue to run immediately. Then have the scheduler look for those types of jobs every minute. You could even have a page that lets the admin know the job is running, or send an email/Slack notification when done.
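A minimal sketch of that button-to-queue idea (the SendMyAppEmails job class is a hypothetical name, not something from this thread):

Route::post('send-email', function () {
    // Queues the work and returns immediately instead of
    // blocking the request for the full run.
    dispatch(new \App\Jobs\SendMyAppEmails());

    return response()->json(['status' => 'queued']);
});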

TonsOfLaz

Hi @nate.a.johnson, thanks for the reply. I kick off the job with an ajax call and set it not to abort when the window is closed, so it shouldn't tie up the browser at all.

I have avoided using queues so far, and would prefer not to unless absolutely necessary. I have had issues with serialization of the (many) objects involved in the process not working correctly when the queued jobs run, so I didn't want to have to wrestle with that on a process that already works.

If it is impossible to let a PHP process run for 20 minutes in the background, so be it. But I didn't think that was the case.

nate.a.johnson

I run the scheduler with jobs all the time, but I've never once used a queue. It's just a cron job that kicks off one of your commands.
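For reference, this is the cron entry the Laravel docs give for driving the scheduler (with the artisan path filled in for your server):

* * * * * php /path/to/artisan schedule:run >> /dev/null 2>&1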

nate.a.johnson

Lastly, all you'd need to do is create a table, and when the admin starts a job, insert a flag in the table. Then the task can just look at that table every minute, and if the flag is set, go! When done, remove the flag. Basically just simple job queuing.
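A rough sketch of that flag-table idea inside the schedule() method of app/Console/Kernel.php (the pending_jobs table and its columns are hypothetical names):

use Illuminate\Support\Facades\Artisan;
use Illuminate\Support\Facades\DB;

// Every minute, run the command only if an admin has set the flag.
$schedule->call(function () {
    $flagged = DB::table('pending_jobs')->where('name', 'myapp:send')->exists();

    if ($flagged) {
        Artisan::call('myapp:send');

        // Clear the flag so the next tick doesn't start another run.
        DB::table('pending_jobs')->where('name', 'myapp:send')->delete();
    }
})->everyMinute();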

TonsOfLaz

I use the scheduler for many items (without a queue) as well, but this one needs to run only when the Admin clicks to run it. Is there a way to use the scheduler to kick something off on demand without using queues?

Adding my own queue/flag system has other complications: it would need to check whether the job is currently running so it doesn't start multiple times, and if the job failed there would need to be a way to clear out the queue flag, and so on. I am hoping to avoid those complications and just run a long process when it is clicked to run. Sounds like that might not be possible? Or maybe I have max_execution_time and set_time_limit set in the wrong place?
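For what it's worth, the scheduler can cover those two complications itself: withoutOverlapping() stops a second run from starting while one is in progress, and a when() truth test can gate the run on the flag. A sketch, again assuming a hypothetical pending_jobs table:

use Illuminate\Support\Facades\DB;

// Only fires when the admin's flag is set, and never runs two copies at once.
$schedule->command('myapp:send')
    ->everyMinute()
    ->withoutOverlapping()
    ->when(function () {
        return DB::table('pending_jobs')->where('name', 'myapp:send')->exists();
    });

The command itself would still need to clear the flag when it finishes, for example in a try/finally around the send logic so a failed run doesn't leave a stale flag behind.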
