@bobbybouwmann
I'm not certain your first solution would work. Let's say the "A-finished" job gets dispatched, and then "B", "B", and "B-finished" are dispatched concurrently. If I had three queue listeners, then all three jobs would get picked up. This would cause "B-finished" to run early.
For your alternative solution, what would trigger the cronjob? My goal here is essentially to have both multi-threading (by running jobs in parallel) and process dependency (by using chaining or some other approach).
My application currently has 7 processes that need to be chained, and each process currently runs actions in a loop over a series of models. However, each model-specific action is isolated to that model, so I'd like to run that part concurrently.
Here's the sort of workflow I'm expecting:
1. Dispatch the first process ("A")
2. "A" dispatches 5+ "A-record" processes
3. Something ("A", "A-record", or other) dispatches the second process ("B") when all "A-record" processes are complete
4. "B" dispatches 300+ "B-record" processes
5. Something dispatches the third process ("C"), similar to #3
6. "C" dispatches 8000+ "C-record" processes
7. etc.
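For what it's worth, Laravel's job batching (added in Laravel 8, so only an option if that's available to you) looks close to what I'm describing: the `then` callback would be the "something" that dispatches "B". This is just a sketch; `ARecord`, `ProcessARecord`, and `ProcessB` are stand-ins for my actual model and job classes:

```php
use App\Jobs\ProcessARecord; // hypothetical per-record job; must use the Batchable trait
use App\Jobs\ProcessB;       // hypothetical second process
use Illuminate\Bus\Batch;
use Illuminate\Support\Facades\Bus;

// Build one "A-record" job per model. The batch's jobs run in parallel
// across however many queue workers happen to be available.
$jobs = ARecord::all()
    ->map(fn ($record) => new ProcessARecord($record))
    ->all();

Bus::batch($jobs)
    ->then(function (Batch $batch) {
        // Runs once, only after every "A-record" job in the batch has completed.
        ProcessB::dispatch();
    })
    ->dispatch();
```

"B" could then do the same thing with its 300+ "B-record" jobs and a `then` callback that dispatches "C", and so on down the chain.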
Since my plan here is to use Laravel Vapor in the long run, when something like #6 happens, I'm expecting 8000+ queue listeners to spin up, handle the job, and spin down. I'm currently just operating in a basic Laravel environment, so I'll likely only have 1 to 4 listeners always running until I get this process ironed out. If I were to chain the 8000 job instances together, then they'd run sequentially, rather than in parallel.
I think my biggest problem here is knowing when to dispatch the next process that has a dependency (i.e. #3 and #5). I can envision various ways of tackling this problem, but they all have pros and cons that make me worry I'm not thinking about the solution in the correct manner (hence why I'm coming here for insight).
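One of the ways I can envision, for the record, is an atomic countdown: "A" stores the number of "A-record" jobs it dispatched, and each "A-record" job decrements that counter as it finishes, dispatching "B" when it hits zero. A rough sketch (job and key names made up), which relies on the cache driver's decrement being atomic (Redis and Memcached are; the file driver is not):

```php
use App\Jobs\ProcessARecord; // hypothetical per-record job
use App\Jobs\ProcessB;       // hypothetical next process in the chain
use Illuminate\Support\Facades\Cache;

// In "A", after building the list of records to process:
Cache::put('a-records-remaining', count($records));

foreach ($records as $record) {
    ProcessARecord::dispatch($record);
}

// At the end of each ProcessARecord::handle():
if (Cache::decrement('a-records-remaining') <= 0) {
    // The last "A-record" job to finish kicks off "B".
    ProcessB::dispatch();
}
```

The obvious con is that a failed "A-record" job never decrements the counter, so "B" never runs, which is exactly the kind of trade-off I'm unsure about.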