
shalu_104

How to handle large sequential data in queue

I need help with the following: I have a large amount of data in which each record depends on the previous one. If any record in the middle is updated, all records after it must be updated accordingly.

Records can be as follows

Id    Value1    Value2
R1    10        20
R2    20        35
R3    35        42

So if we update Value2 of R1 to 21, the changes cascade as follows:

Id    Value1    Value2
R1    10        21
R2    21        36
R3    36        43

For now I'm using a Redis queue for this, but the queue often fails because of the data volume and causes issues. Is there a more efficient way to achieve this? There are thousands of records, which makes it difficult. Thank you.
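The cascade above can be sketched in plain PHP (the function name and array shape here are my own illustration, not from the thread): each row's Value1 mirrors the previous row's Value2, and each row keeps its own Value2 − Value1 delta, so a single in-order pass recomputes everything after the edited record.

```php
<?php
// Recompute the chain after setting one row's Value2.
// Assumes $rows is ordered and each row is ['Value1' => int, 'Value2' => int].
function recomputeChain(array $rows, int $startIndex, int $newValue2): array
{
    $rows[$startIndex]['Value2'] = $newValue2;
    for ($i = $startIndex + 1; $i < count($rows); $i++) {
        // Preserve this row's original delta between Value2 and Value1.
        $delta = $rows[$i]['Value2'] - $rows[$i]['Value1'];
        // Value1 follows the previous row's (already updated) Value2.
        $rows[$i]['Value1'] = $rows[$i - 1]['Value2'];
        $rows[$i]['Value2'] = $rows[$i]['Value1'] + $delta;
    }
    return $rows;
}
```

With the rows from the example, setting R1's Value2 to 21 yields R2 = (21, 36) and R3 = (36, 43), matching the second table. Note that because each per-row delta is preserved, bumping one Value2 by k shifts every later row's pair by the same k, which is what makes a single set-based UPDATE possible.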

Niush

I don't know how you are handling it in your queue jobs, but you can use a single DB query to increment the integers.

Something like this:

// First update R1.Value2 on its own, then shift every later row in one query.

$increment_by = 1; // or whatever

\DB::table('records')
    ->where('id', '>', 1) // e.g. IDs greater than R1
    ->update([
        // Cast to int so the raw SQL fragment is safe; named bindings
        // can't be injected into update() via setBindings().
        'Value1' => \DB::raw('Value1 + ' . (int) $increment_by),
        'Value2' => \DB::raw('Value2 + ' . (int) $increment_by),
    ]);
