
JimNayzium

Job vs. Scheduled Task vs. Cron Fundamental Questions

I am starting a few training videos but wanted to ask a question here before I started.

My app will have an on-going perpetual updating of my own mysql database from a third-party API set of data. In addition to that, the front end will be hitting that database with frequent calls for the data to use in my application/game.

I currently have Scheduled artisan commands running so the ONE cronjob running for Scheduler every X minutes/days/weeks was a Godsend!!

I think I can further optimize everything though based on things I've picked up here and there from different Jeffrey videos! But I am not sure I understand the difference in Jobs and Commands and Scheduled Tasks etc.

If my server is running Scheduled Commands while users are querying the database for the data, I currently believe that my DB::transactions will make sure the data is consistent, but I am curious if the actual processes are slower than they need to be since I don't use JOBS at all currently.

I am wondering whether fetching the external API data and updating the database on the Laravel backend could happen completely independently of serving the front-end fetch requests from JavaScript for that data. I am afraid that right now those two things can happen simultaneously and therefore slow each other down unnecessarily.

Is that what jobs are for? Apologies if this is a naive question. I will return here and update with what I learn in the process. But my quick understanding, from watching an early video by Jeffrey, is that it sort of makes the two things run parallel to each other, like the multiverse: the API updating of the database is one river with its boats going back and forth, and on a completely different river a thousand miles away the fetches and queries for the data are being made to my server, and never the two shall cross.

Anyway, I appreciate everyone's input and awesome videos on this great site!

THANKS!

LaryAI

To address your question about the differences between Jobs, Scheduled Tasks, and Cron jobs in Laravel, and how they can be used to optimize your application, let's break down each concept and how they can be applied to your scenario.

Scheduled Tasks

In Laravel (up to version 10), scheduled tasks are typically defined in the app/Console/Kernel.php file using the schedule method; in Laravel 11 and later they are defined in routes/console.php. These tasks are executed by a single cron job that runs every minute, which checks the schedule and executes any tasks that are due. This is a great way to handle work that needs to run at specific intervals, such as fetching data from an external API.

Jobs

Jobs in Laravel are used to handle tasks that can be processed in the background. They are typically dispatched to a queue, which allows them to be processed asynchronously. This means that your application can continue to handle requests without waiting for the job to complete. Jobs are ideal for tasks that are time-consuming or can be processed independently of the main application flow, such as updating your database with data from an API.

Cron Jobs

Cron is a Unix utility that lets you schedule scripts or commands to run at specific intervals. In the context of Laravel, you typically need only one cron job, which runs the Laravel scheduler every minute; the scheduler then handles all your scheduled tasks.
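For reference, that single crontab entry is the one from the Laravel documentation (the project path below is a placeholder you would replace with your own deployment path):

```shell
# Run the Laravel scheduler every minute; it decides which scheduled tasks are due.
* * * * * cd /path-to-your-project && php artisan schedule:run >> /dev/null 2>&1
```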

Optimizing Your Application

Given your scenario, you can optimize your application by using a combination of scheduled tasks and jobs:

  1. Scheduled Task for API Fetching: Use a scheduled task to periodically fetch data from the third-party API. This task can be set to run at intervals that make sense for your application (e.g., every 5 minutes).

  2. Jobs for Database Updates: Once the data is fetched, dispatch a job to update your database. This job can be queued and processed in the background, ensuring that it doesn't block your application from handling other requests.

  3. Database Transactions: Continue using database transactions to ensure data consistency when updating your database. This will help prevent any data corruption during concurrent updates.

  4. Queue Configuration: Make sure your queue is properly configured to handle the jobs efficiently. You can use different queue drivers (e.g., database, Redis) based on your needs and infrastructure.
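To make step 4 concrete, here is a minimal sketch of setting up the database queue driver and starting a worker; these are standard artisan commands, but which driver you pick depends on your infrastructure:

```shell
# In .env, choose a queue driver (database and redis are common choices):
# QUEUE_CONNECTION=database

# For the database driver, create the jobs table once:
php artisan queue:table
php artisan migrate

# Then run a worker process to consume queued jobs in the background:
php artisan queue:work
```

In production you would typically keep `queue:work` running under a process supervisor such as Supervisor, so that a crashed worker is restarted automatically.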

Here's a basic example of how you might set this up in Laravel:

// In app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    $schedule->call(function () {
        // Fetch data from the API (ApiClient stands in for your own client class)
        $data = app(ApiClient::class)->fetchData();

        // Dispatch a queued job to update the database in the background
        UpdateDatabaseJob::dispatch($data);
    })->everyFiveMinutes();
}

// In app/Jobs/UpdateDatabaseJob.php
class UpdateDatabaseJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public array $data)
    {
    }

    public function handle()
    {
        DB::transaction(function () {
            // Update the database with the fetched data
            foreach ($this->data as $item) {
                // Update logic here
            }
        });
    }
}

By using this approach, you separate the concerns of fetching data and updating the database, allowing them to run independently and efficiently. This should help reduce any potential slowdowns caused by simultaneous operations.

Snapey

Think in terms of processes

one process serving the user request

one process processing the scheduled commands

Both run at the same time, and both may hit the database at the same time, although it takes a lot of web requests and commands before they start to conflict with each other.

Jobs are no different, they are just a way to package some work so that a worker can come along and process the work without slowing web responses.

experimentor

@jimnayzium

Commands are pieces of code we wish to execute outside the User Request <=> Response loop. For example, let's take this logic: If a subscriber's renewal date is 5 days from today, we need to send them a notification email. This logic needs to work in the background. This will not be triggered by a request from a user. This is where Commands, Task Scheduling and Jobs come into the picture.

  • Create a command to collect all the subscribers whose renewal date is 5 days from today.
  • In the command create a job for each subscriber to receive a notification email. If there are 200 subscribers who need to receive an email, I would create 200 jobs (queued).
  • I would have an adequate number of workers deployed to process these jobs quickly and dispatch the emails in parallel.
  • Lastly, we should schedule a task with the above-mentioned command to run every day.

So, the command, scheduled task, the jobs and workers all come together to automate this process of notifying our subscribers.
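The pipeline described above might be sketched like this; the class, signature, and column names (Subscriber, renewal_date, SendRenewalEmailJob) are illustrative assumptions, not code from the thread:

```php
// app/Console/Commands/SendRenewalReminders.php
class SendRenewalReminders extends Command
{
    protected $signature = 'subscribers:send-renewal-reminders';
    protected $description = 'Queue renewal reminder emails for subscribers renewing in 5 days';

    public function handle(): void
    {
        // One queued job per subscriber, so workers can send the emails in parallel
        Subscriber::whereDate('renewal_date', now()->addDays(5))
            ->each(fn ($subscriber) => SendRenewalEmailJob::dispatch($subscriber));
    }
}

// In the scheduler: run the command once a day
$schedule->command('subscribers:send-renewal-reminders')->daily();
```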

Regarding your concerns about speed / slowing down. Here are ways to optimise:

  • Use separate servers for different purposes. I use one set of servers for my users' requests, another for executing long-running cron jobs, and serverless Lambda functions for executing jobs.
  • Even if the servers are separated by concern, the bottleneck is our single source of truth, the database server. Options to reduce the load on the DB include using read replicas for heavy read operations. Caching is also a very effective strategy for frequently requested data. AWS Aurora is very effective at seamlessly scaling DB load, but it is a bit expensive.
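To illustrate the caching point, Laravel's Cache::remember serves a cached copy of frequently requested data and only hits the database when the cache entry is missing or expired; the key name, TTL, and query here are arbitrary examples:

```php
// Cache the result for 60 seconds; the closure only runs on a cache miss
$scores = Cache::remember('latest-scores', 60, function () {
    return DB::table('scores')->orderByDesc('updated_at')->limit(100)->get();
});
```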

Here are a few links to help you out:

  • https://aws.amazon.com/rds/features/read-replicas/
  • https://docs.aws.amazon.com/lambda/
  • https://aws.amazon.com/autoscaling/
  • https://aws.amazon.com/rds/aurora/

JimNayzium

These are all very helpful, everyone! Thanks so much.

The A.I. response actually taught me something that I think my amateur coding didn't quite grasp, and that is to separate the fetching of the data from the updating of the database.

I had, of course, written those things as different functions, but they were all called in sequence inside the ONE job. I realize now I want one job for fetching the data and a second job for updating the database, so they can run independently of each other: I can just pass the fetched data to the update job's constructor and access it in that job's handle() to update the database.
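That hand-off can be sketched as one job dispatching the next with the fetched data; the endpoint URL and class names here are placeholders:

```php
// app/Jobs/FetchApiDataJob.php — the fetch job hands its result to the update job
public function handle(): void
{
    // Hypothetical endpoint; replace with your third-party API
    $data = Http::get('https://api.example.com/data')->json();

    // Pass the fetched data to the update job via its constructor
    UpdateDatabaseJob::dispatch($data);
}
```

One caveat: jobs in a Bus::chain() are all constructed up front, before any of them has run, so a chain cannot feed one job's result into the next. Dispatching the update job from inside the fetch job's handle(), as above, is the simplest way to hand the data over.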

Everything else you all have said makes good sense, and Jeffrey's video on Jobs with the pizza-delivery animations at the beginning was super easy to follow.

Appreciate it everyone!

Snapey

You would not use a job to return anything as it has no return value other than success or failure.

Snapey

basically, don't overthink it. Controllers are used for fetching and updating the database. When you need jobs or scheduled tasks, you will know.

JimNayzium

Quick follow-up concept question.

If I am scheduling artisan commands, and those artisan commands are in turn calling functions from a class that creates Bus::chain() or Bus::batch() jobs, then I am not going to need the $schedule->job() method, am I?

Meaning, artisan commands seem like the fastest and most straightforward way to test my API for my purposes. It's so easy to just have a terminal open, run the artisan command I want, and watch the output in the tail of laravel.log. And I can use Xdebug to step through it as I code it up.

THEN, I have these great little artisan commands that I've used to test it all along, so now I feel like I can just do:

Schedule::command("MyCustomCommand:MyCustomProperty my-custom-parameter")->everyTwoMinutes();

Am I understanding it correctly?

Snapey

@JimNayzium If you want tests, write tests, not commands.

If you want something to happen at a specific time, schedule a closure, a job or an artisan command (it matters nothing to the scheduler)

If your code takes a long time to run (several minutes) then make it a job so that it can be started and left to do its own thing whilst the scheduler continues to fire every minute
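For completeness, the three scheduling forms Snapey mentions look nearly identical to the scheduler; the command name and job class below are placeholders:

```php
// A closure
$schedule->call(fn () => Log::info('tick'))->everyMinute();

// An artisan command
$schedule->command('app:sync-data')->everyTwoMinutes();

// A queued job
$schedule->job(new SyncDataJob)->hourly();
```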
