laracoft's avatar

Rate limiting mailables

use Illuminate\Cache\RateLimiter;
use Illuminate\Support\Facades\RateLimiter as RateLimiterFacade;

// In AppServiceProvider boot method:
RateLimiterFacade::for('mailing-list', function (object $job) {
    return $job->user->rateLimit(5)->perMinute(); // A - defined first time
    // Or a global limit:
    // return Limit::perMinute(5);
});

class SendUserEmail implements ShouldQueue
{
    // ...
    public function middleware(): array
    {
        return [
            // Define the rate limit: e.g., 5 emails per minute
            (new RateLimited('mailing-list'))
                ->allow(5)->every(60)   // B - why do this again?
                // If rate limited, release the job back for 60 seconds
                ->releaseAfterSeconds(60),
        ];
    }
}
  • AI suggested the above code to me
  • Why do we need to define the rate limiting 2 times in A and B?
Glukinho's avatar

What are you trying to achieve in the first place?

As for me, I'm not ready to fix AI-generated problems; people-generated ones are more than enough...

laracoft's avatar

Ok, I want to slow down sending emails, say, send only 10 emails every minute, but I don't want any emails to be dropped, and they have to be sent from a job running on a queue.

Glukinho's avatar

AppServiceProvider

use Illuminate\Support\Facades\RateLimiter;
use Illuminate\Cache\RateLimiting\Limit;

RateLimiter::for('mail', function (object $job) {
    return Limit::perMinute(10);
});

Job

use Illuminate\Queue\Middleware\RateLimited;

public function middleware(): array
{
    return [new RateLimited('mail')];
}

Also don't forget to increase the job's $tries / $maxExceptions / retryUntil(), as by default a job runs only once and will be marked as failed after the first rate-limiter hit.
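A minimal sketch of those retry settings on the job class (the 20-attempt count and one-hour deadline are arbitrary example values, not recommendations):

```php
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\Middleware\RateLimited;

class SendUserEmail implements ShouldQueue
{
    // Allow plenty of attempts, since every rate-limiter hit that
    // releases the job back onto the queue consumes one attempt.
    public $tries = 20;

    // Alternatively, retry until a deadline instead of counting attempts.
    public function retryUntil(): \DateTime
    {
        return now()->addHour();
    }

    public function middleware(): array
    {
        return [new RateLimited('mail')];
    }
}
```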

https://laravel.com/docs/12.x/queues#rate-limiting

Also note that rate limiting doesn't guarantee smooth consumption: if you have 100 jobs in a queue, 10 of them will be dispatched quickly (in 5 seconds, let's say) and the worker will wait the remaining 55 seconds until a new minute begins. Try playing with releaseAfter(...) or perSecond(...) to make it smoother over time.
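As a rough sketch of those two knobs (reusing the 'mail' limiter name from above; the numbers are illustrative, not tuned):

```php
use Illuminate\Support\Facades\RateLimiter;
use Illuminate\Cache\RateLimiting\Limit;
use Illuminate\Queue\Middleware\RateLimited;

// AppServiceProvider::boot(): a per-second ceiling spreads work out
// more evenly than one 10-job burst at the top of each minute.
RateLimiter::for('mail', function (object $job) {
    return Limit::perSecond(1); // at most 1 job per second
});

// In the job: when throttled, release back onto the queue after
// 10 seconds instead of the middleware's default delay.
public function middleware(): array
{
    return [(new RateLimited('mail'))->releaseAfter(10)];
}
```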

laracoft's avatar

Where do I add releaseAfter and perSecond?

Glukinho's avatar

Did you read the docs?

I suggest you first try the simple Limit::perMinute(10) we already gave you, and reach for the more complex options only if that doesn't satisfy your case.

DigitalArtisan's avatar

When you prompt AI correctly, you should get this:

The Correct Way to Throttle Emails in Laravel with Jobs and Queues

Step 1: Use Queue Middleware (RateLimited)

This is the core part of the solution. To throttle emails, you add middleware to your job to enforce the rate limit:

use Illuminate\Queue\Middleware\RateLimited;

class SendEmailJob implements ShouldQueue
{
    public function middleware()
    {
        return [
            new RateLimited('mailing-list'),
        ];
    }

    public function handle()
    {
        // Logic to send the email
    }
}

This ensures the job respects the rate limit defined later.


Step 2: Define Rate Limit in AppServiceProvider

In your AppServiceProvider@boot(), you define the rate-limiting logic:

use Illuminate\Support\Facades\RateLimiter;
use Illuminate\Cache\RateLimiting\Limit;

public function boot()
{
    RateLimiter::for('mailing-list', function (SendEmailJob $job) {
        return Limit::perMinute(10)->by($job->userId);
    });
}

This sets a 10 emails per minute limit for each user (based on userId).
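For the per-user key above to work, the job needs a public $userId property the limiter closure can read, typically set via the constructor. A sketch (the property name just has to match what the closure accesses):

```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;

class SendEmailJob implements ShouldQueue
{
    use Dispatchable, Queueable;

    // Public so the 'mailing-list' limiter closure can read $job->userId
    public function __construct(public int $userId)
    {
    }
}

// Each user then gets an independent 10-per-minute bucket:
SendEmailJob::dispatch($user->id);
```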


Step 3: Use Redis as Queue and Cache Driver

Redis is strongly recommended for distributed rate-limiting, since the limiter's counters live in the cache and must be shared by all workers. Ensure your .env file is configured like this:

QUEUE_CONNECTION=redis
CACHE_DRIVER=redis # renamed to CACHE_STORE in Laravel 11+

Redis ensures the limit is applied safely across multiple workers or instances.


Step 4: Run the Queue Worker

Run the queue worker to process the jobs:

php artisan queue:work redis

This worker will now respect the rate limit (10 emails per minute in this example) and retry any jobs that exceed the rate limit.
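The retry budget can also be set at the worker level instead of per job; a sketch with illustrative flag values:

```shell
# Allow up to 20 attempts per job, waiting 60 seconds between retries,
# so rate-limited releases don't exhaust the attempts immediately.
php artisan queue:work redis --tries=20 --backoff=60
```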


Key Points:

  • Only one correct method, with 4 steps to implement.
  • Redis is mandatory for safe, scalable rate-limiting.
  • RateLimited middleware handles job throttling.
  • No emails are dropped—they’re released back to the queue when the limit is hit.
Snapey's avatar

In my experience this never works.

The queue worker picks up the job, then discovers it cannot send, and burns one of the tries. After a short time, all the jobs fail.

Better to find a good mail provider with no restrictions.

laracoft's avatar

They are all software, so I do wonder, what is Laravel not getting right?
