You may need a load balancer setup.
How can I optimize Server Resource with Laravel and Docker?
My server always has cron jobs running, and sometimes multiple crons run at the same time, which occasionally maxes out the server's resources. How can I avoid this? It's a Laravel app running on Docker, on a VPS with 64 GiB RAM and 32 vCPUs. Does anyone have recommendations for optimizing this app?
@jlrdw Thank you.
@sanjayacloud optimize your cron job schedule to distribute the workload evenly throughout the day. Avoid scheduling multiple resource-intensive tasks to run simultaneously. Use Laravel's task scheduling features to stagger the execution of cron jobs over time. You can use the withoutOverlapping() method to prevent a task from starting if another instance of the same task is still running.
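For example, staggering looks something like this (a sketch in Laravel 11 style using the `Schedule` facade in `routes/console.php`; in older versions you'd call `$schedule->command()` in `app/Console/Kernel.php` instead, and the command names here are placeholders for your own tasks):

```php
use Illuminate\Support\Facades\Schedule;

// Heavy task every 30 minutes; skip a run if the previous one is still going,
// and run it in the background so it doesn't block other scheduled tasks.
Schedule::command('inventory:update')
    ->everyThirtyMinutes()
    ->withoutOverlapping()
    ->runInBackground();

// Offset another heavy task by 15 minutes so it never starts in the same
// minute as the one above; force-release the overlap lock after 10 minutes.
Schedule::command('reports:generate')
    ->hourlyAt(15)
    ->withoutOverlapping(10)
    ->runInBackground();
```

Spreading start times this way keeps the CPU spikes from stacking on top of each other.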
@enoch91 One task runs every 30 minutes, and it's set up like this:
```php
class UpdateInventoryCustomFields implements ShouldQueue, ShouldBeUniqueUntilProcessing
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels, Log;

    /**
     * The number of seconds after which the job's unique lock will be released.
     *
     * @var int
     */
    public int $uniqueFor = 2400;

    /**
     * The number of seconds the job can run before timing out.
     *
     * @var int
     */
    public int $timeout = 2400;

    /**
     * The number of seconds after which the job will be released back to the queue.
     *
     * @var int
     */
    public int $releaseAfter = 150;

    /**
     * Indicate if the job should be marked as failed on timeout.
     *
     * @var bool
     */
    public bool $failOnTimeout = true;

    private Store $store;

    /**
     * Create a new job instance.
     *
     * @return void
     */
    public function __construct(Store $store)
    {
        $this->store = $store;
        $this->queue = 'high';
    }

    /**
     * The unique ID of the job.
     *
     * @return string
     */
    public function uniqueId(): string
    {
        return $this->store->bc_store_hash;
    }

    /**
     * Get the middleware the job should pass through.
     *
     * @return array
     */
    public function middleware(): array
    {
        return [
            (new WithoutOverlapping($this->store->bc_store_hash))
                ->shared()
                ->releaseAfter($this->releaseAfter)
                ->expireAfter($this->uniqueFor),
        ];
    }

    /**
     * Execute the job.
     *
     * @return void
     * @throws Exception
     */
    public function handle(): void
    {
        $facade = App::make(InventoryCustomFieldFacade::class);
        Auth::setUser($this->store);

        $errors = [];

        try {
            $errors = $facade->updateBigCommerceInventoryCustomFields();
        } catch (Exception $e) {
            $errors[] = $e->getMessage();
        }

        if (!empty($errors)) {
            $this->log(Auth::user()->name, 'Errors: ', context: $errors, level: 'error', channel: 'inventory_fields');
            $this->fail(new InventoryCustomFieldUpdateJobFailed($errors));
        }
    }
}
```
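Since the app runs in Docker on a 64 GiB / 32 vCPU host, it may also be worth capping the container that runs the scheduler or queue workers, so one heavy run can't starve everything else. A sketch using the Compose spec's resource limits (the service name, image, and the exact CPU/memory numbers here are hypothetical; tune them to your workload):

```yaml
# docker-compose.yml
services:
  queue-worker:
    image: your-app-image            # placeholder for your actual image
    command: php artisan queue:work --queue=high --max-time=2400 --tries=1
    deploy:
      resources:
        limits:
          cpus: "8"                  # cap at 8 of the 32 vCPUs
          memory: 8g                 # cap at 8 GiB of the 64 GiB RAM
```

With a cap like this, a runaway job degrades only its own container rather than the whole VPS.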
@sanjayacloud don't forget it's as important to optimise the queries in jobs as the queries in controllers,
yet it's harder to spot the expensive processes and missing indexes there that could make a massive difference to execution time.
@Snapey I have already optimized the queries as much as I can. There are no duplicate or unused queries in this process.
@sanjayacloud so what are the execution times of these jobs? I expect you have this info
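If you don't record them yet, one lightweight way to capture per-job execution time is a pair of queue event listeners, e.g. in a service provider's `boot()` method (a sketch; the log message and channel are up to you):

```php
use Illuminate\Queue\Events\JobProcessed;
use Illuminate\Queue\Events\JobProcessing;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Queue;

public function boot(): void
{
    $start = null;

    // queue:work processes one job at a time per worker process,
    // so a single shared timestamp is enough here.
    Queue::before(function (JobProcessing $event) use (&$start) {
        $start = microtime(true);
    });

    Queue::after(function (JobProcessed $event) use (&$start) {
        Log::info('Job finished', [
            'job'     => $event->job->resolveName(),
            'seconds' => round(microtime(true) - $start, 2),
        ]);
    });
}
```

A few days of these logs would show whether the 30-minute job ever approaches its 2400-second timeout, and which runs collide.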