
AdamT:

Scaling application across multiple servers when using a Redis queue?

Good afternoon:

I'm currently looking at scaling my application over multiple servers to speed up my Redis queues. I'm sending thousands of emails, which eats through the RAM of my single server and sends its CPU usage through the roof.

I'm currently using Redis to store the queue. Since I need to hold a large number of jobs, I'm planning to set up a dedicated Redis server to store the jobs until they are processed, and then a separate worker server to process the jobs held on that Redis server.
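If this is a Laravel app (the worker processes mentioned later in this thread suggest it is), splitting the queue onto its own server is mostly a configuration change: both the app server and the worker server point their Redis connection at the dedicated queue host. A minimal sketch, assuming Laravel's standard environment variables (the host name here is a made-up placeholder):

```
# .env on BOTH the app server (which dispatches jobs) and the
# worker server (which runs `php artisan queue:work`).
# redis-queue.internal is a hypothetical hostname for the dedicated Redis box.
QUEUE_CONNECTION=redis
REDIS_HOST=redis-queue.internal
REDIS_PORT=6379
REDIS_PASSWORD=null
```

The app server then only pushes jobs; the worker server runs the queue workers; and the Redis server does nothing but hold the queue, so each box can be sized for its one role.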

I'm new to scaling out an application, so I was wondering whether this is an appropriate way to organize my servers when using Redis to manage the queues?

Many Thanks, Adam

Snapey:

Not addressing your problem directly, but make sure you only queue the requirement to send an email (e.g. a user ID), not the rendered mail itself.
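In other words, the queued payload should be tiny, and the expensive work (loading the user, rendering the message) should happen on the worker when the job runs. A minimal sketch, assuming Laravel; the class names `SendWelcomeEmail`, `WelcomeMail`, and the `User` model are hypothetical stand-ins:

```php
<?php

namespace App\Jobs;

use App\Mail\WelcomeMail;   // hypothetical mailable
use App\Models\User;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Queue\Queueable;
use Illuminate\Support\Facades\Mail;

class SendWelcomeEmail implements ShouldQueue
{
    use Queueable;

    // Only a scalar ID is serialized into Redis -- not the user
    // record, and not a rendered mail body.
    public function __construct(public int $userId)
    {
    }

    public function handle(): void
    {
        // The heavy lifting happens here, on the worker server,
        // after the job has been pulled off the queue.
        $user = User::findOrFail($this->userId);

        Mail::to($user->email)->send(new WelcomeMail($user));
    }
}

// Dispatching from the app server stores just the ID in Redis:
// SendWelcomeEmail::dispatch($user->id);
```

Queuing thousands of jobs this way keeps each Redis entry a few hundred bytes instead of a full serialized message, which directly addresses the RAM pressure described in the question.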

2 likes
kobear:

I have a production app that is processing about 2 million jobs an hour. It does not send email, but it does metric inserts into an InfluxDB database using the HTTP API, so it's similar in that it relies on an out-of-process wait time.

This is all running on a single server right now (InfluxDB, Redis, and 60 Laravel queue worker processes). However I want to split it up in the future.
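For reference, running a fixed pool of queue workers like this is usually done with a process supervisor rather than by hand. A sketch of a Supervisor config for a pool of 60 workers, assuming a Laravel app (the install path and user are placeholders):

```
; Hypothetical /etc/supervisor/conf.d/laravel-worker.conf
; Path, user, and tuning flags are assumptions -- adjust for your setup.
[program:laravel-worker]
command=php /var/www/app/artisan queue:work redis --sleep=1 --tries=3
process_name=%(program_name)s_%(process_num)02d
numprocs=60
autostart=true
autorestart=true
user=www-data
redirect_stderr=true
stdout_logfile=/var/log/laravel-worker.log
```

Because the workers only talk to Redis over the network, the same config can later be copied to a second worker server to split the pool across machines.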

If you are interested in what I am doing here, I am (slowly) doing a series on it at Medium. Find the intro here: https://medium.com/pragmatic-coding/exporting-sitescope-metrics-into-influxdb-introduction-5104e328ec92
