Some sort of CPU- and memory-intensive calculation and reporting algorithm that runs for days.
I know it's not really a good idea to implement something like this in PHP.
I'm still trying to figure out how to pull this out as a separate module, perhaps not in PHP.
I would expect the --memory option for the queue worker to be set lower than the PHP CLI process's memory_limit. Ultimately, PHP decides how much memory the process can allocate, not the queue worker.
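For context, the worker-side check is only an after-the-fact comparison, not an enforcement mechanism. A simplified sketch (an assumption modeled on what Illuminate\Queue\Worker does with memory_get_usage(); names here are illustrative, not Laravel's exact API):

```php
<?php

// Sketch: how a queue worker can decide to stop itself once its own
// usage crosses the --memory threshold (in megabytes). It only
// *observes* usage after a job has already run; it cannot cap it.
function memoryExceeded(int $memoryLimitMb): bool
{
    return (memory_get_usage(true) / 1024 / 1024) >= $memoryLimitMb;
}

// In the worker loop, after each job:
// if (memoryExceeded($workerMemoryOption)) {
//     exit(12); // worker stops; Supervisor/systemd is expected to restart it
// }
```

This is why the job itself can blow straight past the --memory value: the check runs between jobs, while memory_limit is enforced by the PHP engine on every allocation.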
I just confirmed this by making a job that uses 100M of memory.
WHEN php.ini's memory_limit=200M is set higher than the command's --memory=50 setting,
the job continues past the 50M threshold and runs until it finishes, and then the queue worker exits with code 12.
WHEN the command's queue:work --memory=200 setting is higher than php.ini's memory_limit=50M, the job does not run to completion: as soon as memory usage exceeds 50M, PHP raises a fatal out-of-memory error and the worker exits before the job finishes.
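Both cases can be reproduced from the command line (assuming a standard Laravel app and a queued job that allocates ~100M; memory_limit is overridden per-invocation with PHP's -d flag instead of editing php.ini):

```shell
# Case 1: ini limit (200M) above the worker threshold (50M).
# The 100M job runs to completion, then the worker exits with code 12.
php -d memory_limit=200M artisan queue:work --memory=50

# Case 2: ini limit (50M) below the worker threshold (200M).
# PHP aborts the job mid-run with a fatal "Allowed memory size exhausted" error.
php -d memory_limit=50M artisan queue:work --memory=200
```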
Do not fool yourself: the worker's --memory parameter DOES NOT set your PHP memory limit. It only compares the worker's memory usage against that value and stops the worker once it is exceeded.
The only truly effective hard limit is the memory_limit set in your php.ini.
The official Laravel documentation misses that point IMO.