
s.lavoie.b@gmail.com

How to handle large file uploads to S3?

I have a fairly basic setup for file uploads to S3: a dedicated File class that uploaded files are sent to. Everything works, but when uploading big files I run out of memory. As I understand it (and I'm not sure I really understand), it's because of file_get_contents. As the docs say:

The put method may be used to store a file on disk. You may also pass a PHP resource to the put method, which will use Flysystem's underlying stream support. Using streams is greatly recommended when dealing with large files:

Storage::put('file.jpg', $contents);

Storage::put('file.jpg', $resource);  // <-- What is this?

I suppose $contents represents my current file_get_contents($uploadedFile), but how do I pass a resource? Should I use fopen? Or use the S3 multipart uploader? I'm not sure what direction to take here. For reference, the files would be around 100 to 300 MB.

Here is my file:

    /**
     * Uploads a file.
     *
     * @param  UploadedFile $uploadedFile
     * @return $this
     */
    public function upload(UploadedFile $uploadedFile)
    {
        $s3Uploadfilepath = "path/to/my/filename.ext";

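        // file_get_contents() loads the entire file into memory, which is what causes the OOM on big uploads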
        $s3Upload = Storage::disk('s3')->put($s3Uploadfilepath, file_get_contents($uploadedFile));

        return $this;
    }

Thanks :)

s.lavoie.b@gmail.com

Well, as usual, asking the question helps finding the answer. Here’s the answer for those who might search for the same thing: instead of using `file_get_contents`, use `fopen($uploadedFile->getRealPath(), 'r+')`.

Although the answer is not in the Laravel documentation, you can learn a lot more in the flysystem package guide: http://flysystem.thephpleague.com/recipes/.
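To spell it out, the method above then looks roughly like this (an untested sketch; the is_resource() guard is there because Flysystem may close the stream itself):

    /**
     * Uploads a file via a stream instead of loading it into memory.
     *
     * @param  UploadedFile $uploadedFile
     * @return $this
     */
    public function upload(UploadedFile $uploadedFile)
    {
        $s3Uploadfilepath = "path/to/my/filename.ext";

        // fopen() gives put() a PHP resource, so Flysystem streams the file in chunks
        $stream = fopen($uploadedFile->getRealPath(), 'r+');

        Storage::disk('s3')->put($s3Uploadfilepath, $stream);

        // Flysystem may have closed the stream already, hence the guard
        if (is_resource($stream)) {
            fclose($stream);
        }

        return $this;
    }

Mode 'r' should also do, since the stream is only ever read from.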

I would still be interested in how to use the S3 multipart uploader with Laravel, though.

rhand

Did you ever find out about the multipart upload using S3 / PHP Flysystem @s.lavoie.b@gmail.com?

martinbean

@rhand Use multi-part uploading with JavaScript: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/s3-example-photo-album-full.html

The example uses photos, but you can use the same approach for any type of file, including video. I use this approach myself in my own video on demand platform.

You don’t want to be uploading massive video files via PHP; as the OP found out, you’ll just hit memory and timeout issues.

rhand

@martinbean Thanks a lot for this tip. Really appreciate it. The reason I was looking into this is to deal with backups from Spatie's Laravel Backup package being sent to DigitalOcean Spaces and hitting rate limits. Seems I need multipart uploads to avoid those rate limits for larger files.

We are using a Laravel Vue app, and storing on disk is done with PHP Flysystem using the S3 adapter. But I think we should be able to use the JavaScript SDK as well to communicate with Spaces through their API, since it fully supports all the S3 methods. That is why `"league/flysystem-aws-s3-v3": "^1.0"` is used, after all.

And we sure do want to avoid memory issues with PHP. That is why we already use Dropzone and client-side methods for image resizing. But the current setup of storing via the Storage facade is clearly too basic, causing issues with large backup files as well as smaller image files. Also looking into serializing PutObjects, but that is another story that could use its own thread...

Also heard that using putFile or streaming files >= 5 MB is good. Still reading more on that topic as well.
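For the multipart part, it looks like the AWS SDK for PHP underneath the Flysystem adapter ships an Aws\S3\MultipartUploader, so something along these lines should work against Spaces too. Just a sketch pieced together from the SDK docs, not tested; the endpoint, env keys, bucket, and paths are placeholders:

    use Aws\Exception\MultipartUploadException;
    use Aws\S3\MultipartUploader;
    use Aws\S3\S3Client;

    // Spaces speaks the S3 API, so the client only needs a custom endpoint.
    // Region, endpoint, and credential names below are placeholders.
    $client = new S3Client([
        'version'     => 'latest',
        'region'      => 'us-east-1',
        'endpoint'    => 'https://nyc3.digitaloceanspaces.com',
        'credentials' => [
            'key'    => env('DO_SPACES_KEY'),
            'secret' => env('DO_SPACES_SECRET'),
        ],
    ]);

    // Upload in 16 MB parts (S3's minimum part size is 5 MB).
    $uploader = new MultipartUploader($client, storage_path('app/backup.zip'), [
        'bucket'    => 'my-space',
        'key'       => 'backups/backup.zip',
        'part_size' => 16 * 1024 * 1024,
    ]);

    try {
        $result = $uploader->upload();
        // $result['ObjectURL'] holds the final location.
    } catch (MultipartUploadException $e) {
        // A failed upload can be resumed from $e->getState().
        report($e);
    }

For anything smaller, Storage::putFile() already streams the file for you, so the plain facade may be enough there.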
