[email protected]
3 years ago
122
1
Laravel

How to handle large file uploads to S3?

Posted 3 years ago by [email protected]

I have a fairly basic setup for file uploads to S3: a dedicated File class to which uploaded files are sent. Everything works, but when uploading big files I run out of memory. As I understand it (and I'm not sure I really do), this is because of file_get_contents. As the docs say:

The put method may be used to store a file on disk. You may also pass a PHP resource to the put method, which will use Flysystem's underlying stream support. Using streams is greatly recommended when dealing with large files:

    Storage::put('file.jpg', $contents);

    Storage::put('file.jpg', $resource);  // <-- What is this?

I suppose $contents corresponds to my current file_get_contents($uploadedFile), but how do I pass a resource? Should I use fopen? Or the S3 multipart uploader? I'm not sure what direction to take here. For reference, the files would be around 100 to 300 MB.
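If I'm reading the docs right, passing a resource would look something like this. This is just my untested guess based on the stream support they mention; 'file.jpg' is a placeholder:

    // Open a read-only stream to the uploaded file's temporary path
    // instead of loading the whole file into memory.
    $stream = fopen($uploadedFile->getRealPath(), 'r');

    Storage::disk('s3')->put('file.jpg', $stream);

    // Flysystem may close the stream itself, so guard the fclose.
    if (is_resource($stream)) {
        fclose($stream);
    }

Is that the right idea, or am I missing something?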

Here is my upload method:

    /**
     * Uploads a file.
     *
     * @param  UploadedFile $uploadedFile
     * @return $this
     */
    public function upload(UploadedFile $uploadedFile)
    {
        $s3Uploadfilepath = "path/to/my/filename.ext";

        // file_get_contents() reads the entire file into memory before
        // uploading, which is where I run out of memory on large files.
        Storage::disk('s3')->put($s3Uploadfilepath, file_get_contents($uploadedFile));

        return $this;
    }
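I also came across putFileAs, which, if I understand correctly, streams the file automatically. Would something like this work in my method instead (again untested, and the path and filename here are placeholders)?

    Storage::disk('s3')->putFileAs('path/to/my', $uploadedFile, 'filename.ext');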

Thanks :)
