I've got to upload some (potentially) very large files to my S3 bucket from a Laravel Job I am building. I am getting the dreaded "Allowed memory size of ### bytes exhausted" error, and I have no interest in increasing the memory limit in php.ini, since I don't know how large some of these files will get, and at some point I need to stop running away from the problem by raising memory_limit to ridiculous levels.
The question is: Does Laravel make chunking this thing easy? Is there a function I am not seeing that I can use?
I know the answer is probably no, but Laravel makes SO many things easy for me that I figured I might ask, in case I missed something in my Googling.
If this does not exist in Laravel, what should I do? I know that I need to take the file into memory a chunk at a time, but I have no idea where to start on that.
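To illustrate the direction I'm imagining, here is an untested sketch of what I'd hope for: handing Laravel's `Storage` facade a stream resource instead of the file contents, so the file never has to fit in memory. The disk name, paths, and filenames below are placeholders, and I'm assuming `Storage::put()` accepts a resource:

```php
<?php

use Illuminate\Support\Facades\Storage;

// Placeholder path to the large local file the Job is processing.
$localPath = storage_path('app/tmp/bigfile.zip');

// Open a read stream rather than reading the whole file into a string.
$stream = fopen($localPath, 'r');

// Hoped-for behavior: the S3 adapter consumes the stream in chunks,
// so peak memory stays flat regardless of file size.
Storage::disk('s3')->put('uploads/bigfile.zip', $stream);

// Some adapters close the stream themselves, so guard the fclose().
if (is_resource($stream)) {
    fclose($stream);
}
```

If something like this (or `Storage::putFile()`) already streams under the hood, that would answer my question; otherwise I'd appreciate a pointer on how to do the chunked reads myself.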