
goldtaste

Maximum execution time of 600 seconds exceeded

I have a cron job that runs a php script from the command line:

cd /home/myproject/ && php -d register_argc_argv=On artisan run:jobs

I assumed there would be no maximum execution time, as the PHP script is effectively being run at the command line.

And in the past, the script has worked and run for 3 or 4 hours.

I tried running the script with more records (I have successfully processed 30,000 records; I am now trying to process 45,000) and got a timeout about three-quarters of the way through the job.

In the script, I am adding users in batches to an external site through API calls.

The error I'm getting is:

Maximum execution time of 600 seconds exceeded {"exception":"[object] (Symfony\Component\ErrorHandler\Error\FatalError(code: 0): Maximum execution time of 600 seconds exceeded at /home/myproject/vendor/guzzlehttp/guzzle/src/Handler/CurlFactory.php:563)

What I'm not sure about is whether this issue is related to Guzzle and some sort of call that took over 10 minutes (600 seconds). Is the maximum execution time specific to Guzzle?

Or is it just that the PHP script is taking too long? As mentioned, this script is run as a background process by a cron job, so I thought this shouldn't be an issue. Am I wrong? I have run jobs that took 3 to 4 hours, far longer than the 10 minutes suggested by the error. The job in question ran for about 4 hours before the error was thrown.

Please note, that I will probably look at batching the API calls going forward, so the background jobs can be shorter.

Thanks in advance to anyone who helps.

LaryAI

The error message "Maximum execution time of 600 seconds exceeded" means the script ran longer than PHP's max_execution_time limit. The limit is not specific to Guzzle: PHP keeps a single timer for the whole script, and the stack trace simply shows where execution happened to be (inside Guzzle's CurlFactory) when the timer fired. This is a common issue with long-running scripts or slow API calls. Note that the CLI SAPI sets max_execution_time to 0 (unlimited) by default, so a 600-second limit on a command-line script usually means it has been set explicitly, either in the CLI php.ini or by the application itself (for example via set_time_limit()).

To resolve this issue, you can increase the maximum execution time in your PHP configuration. Here are a few ways to do it:

  1. Modify php.ini: Locate the php.ini file your CLI uses (you can find its location by running php --ini in the command line) and open it in a text editor. Look for the max_execution_time directive and increase its value to a higher number, such as 18000 (5 hours), or 0 for no limit. For a command-line script there is no web server to restart; the change takes effect the next time the script runs.

  2. Modify .htaccess: This only applies when PHP runs under Apache as a web request, so it won't affect a command-line cron job. For completeness, if you don't have access to the php.ini file for a web-facing script, you can add the following line to the .htaccess file in your project's root directory:

php_value max_execution_time 18000

Save the file and check if the new maximum execution time is applied.

  3. Modify the script: If you want to increase the maximum execution time only for this specific script, you can add the following line at the beginning of your PHP script:

ini_set('max_execution_time', 18000);

This will override the configured maximum execution time for this script only.
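As a concrete sketch of option 3 (assuming the 600-second limit is being applied to the CLI process itself):

```php
<?php
// Sketch only: raising or removing the execution limit for a long-running
// CLI job. set_time_limit() restarts the timer each time it is called;
// a value of 0 means "no limit". The CLI SAPI already defaults to 0, so a
// 600-second limit usually comes from php.ini or application code.
set_time_limit(0);                     // remove the limit for this process
ini_set('max_execution_time', '0');    // equivalent ini-based form

echo ini_get('max_execution_time');    // prints "0" once either call has run
```

The same thing can be done without touching the code by passing the setting on the command line, e.g. php -d max_execution_time=0 artisan run:jobs.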

Remember to choose a reasonable value for the maximum execution time based on the requirements of your script. Setting it too high (or disabling it entirely) can mask other issues, such as a hung API call or memory exhaustion, by letting a stuck script run unnoticed.

Additionally, as you mentioned, batching the API calls can help reduce the execution time and prevent timeouts. You can process a certain number of records at a time and then pause the script before continuing with the next batch. This way, you can avoid hitting any time limits imposed by the API or PHP.
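A minimal sketch of that batching idea (the chunk size, pause length, and the $handler callback are illustrative placeholders, not names from the actual job):

```php
<?php
// Sketch: process records in fixed-size chunks, resetting the execution
// timer between chunks so total runtime can exceed any single limit.
function processInBatches(array $records, int $batchSize, callable $handler): int
{
    $processed = 0;
    foreach (array_chunk($records, $batchSize) as $batch) {
        foreach ($batch as $record) {
            $handler($record);   // e.g. the Guzzle call that adds one user
            $processed++;
        }
        set_time_limit(600);     // restart the 600-second timer per batch
        usleep(100_000);         // brief 100 ms pause to go easy on the API
    }
    return $processed;
}
```

With 45,000 records and a batch size of, say, 500, this yields 90 batches, each of which gets a fresh timer, so the total runtime can safely exceed 600 seconds.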

I hope this helps! Let me know if you have any further questions.
