Min_Khant_Saw wrote a reply

4w ago

I found your multipart upload suggestion really helpful; thanks for sharing it. After digging into it a bit more, I came across a Laravel package that makes the process much simpler.

This library integrates nicely with Laravel’s built-in storage system and feels more straightforward to use:

https://github.com/mreduar/s3m

It looks like a good fit for handling large file uploads with S3 in a more Laravel-friendly way.

Min_Khant_Saw liked a comment

4w ago

@min_khant_saw You’ll need to use multipart uploading. There are a number of JavaScript libraries that can handle this for you, including AWS’s own JS SDK. The AWS JS SDK repository even has sample code for exactly this: https://github.com/awsdocs/aws-doc-sdk-examples/blob/main/javascriptv3/example_code/s3/scenarios/multipart-upload.js
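A minimal sketch of the part bookkeeping behind that approach, in plain JavaScript (the function and field names here are my own, not from the SDK). Each computed range would then be sent with the SDK's `UploadPartCommand`, after starting the upload with `CreateMultipartUploadCommand`:

```javascript
// S3 multipart constraints: every part except the last must be at least
// 5 MiB, and a single upload may have at most 10,000 parts.
const MIN_PART_SIZE = 5 * 1024 * 1024;
const MAX_PARTS = 10000;

// Split a file of `fileSize` bytes into [start, end) byte ranges of
// `partSize` bytes each. Each range maps to one UploadPartCommand call,
// with PartNumber starting at 1 (S3 part numbers are 1-based).
function partRanges(fileSize, partSize = MIN_PART_SIZE) {
  if (partSize < MIN_PART_SIZE) {
    throw new Error("part size is below the S3 minimum of 5 MiB");
  }
  const ranges = [];
  for (let start = 0; start < fileSize; start += partSize) {
    ranges.push({
      partNumber: ranges.length + 1,
      start,
      end: Math.min(start + partSize, fileSize), // last part may be short
    });
  }
  if (ranges.length > MAX_PARTS) {
    throw new Error("too many parts; increase the part size");
  }
  return ranges;
}
```

After all parts succeed, the collected `{ ETag, PartNumber }` pairs are passed to `CompleteMultipartUploadCommand` to stitch the object together.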

Min_Khant_Saw started a new conversation

1mo ago

I want to handle uploading and downloading large video files from S3 in a way that supports resuming if the client experiences network interruptions or other issues.

If the upload or download is interrupted, the client should be able to resume from the last completed part instead of starting over. I want to know the best practices or tools in Laravel (and optionally JavaScript/React/Inertia) to achieve resumable uploads and downloads.

How can I implement this functionality efficiently for large files stored on S3?