Processing chunked uploads: help please

Posted 1 year ago by NotAGoodCoder

So I'm accepting user-uploaded videos for a new internal service I'm building. Because most of the videos are large, I'm using Dropzone.js and the chunked upload functionality it implements. Right now, my store method receives each uploaded 20MB chunk, stores it in storage/app/processing under a semi-sanitized filename, and keeps doing that for every chunk that arrives. On the final chunk, I need to combine all of the uploaded chunks into one file, move that file to a "transcoding" directory, and then fire off a job to start the transcoder.

I've got the beginning and the end down; it's the middle part I'm stuck on, specifically how to do it the "Laravel way". Right now I just loop through the chunks, read each one, and append it to a new temporary file, then move that file to the transcoding directory. My problem is that this is hugely memory intensive, and I feel like there is a saner way to handle it. The docs at https://laravel.com/docs/5.5/filesystem#storing-files say that using streams is recommended for large files, but I'm not sure what that means in practice or how to implement it. Can someone take a look and help me improve the performance and memory footprint?

Here is my store method in my controller:

public function store(Request $request)
{
    // Uses the Storage facade plus my Video model and TranscodeVideo job
    $file = $request->file('video');

    // Hash the dzuuid with md5 so we never store a tampered uuid from the request
    $processing_filename = md5($request->dzuuid);

    $file->storeAs('processing', $processing_filename . '.' . $request->dzchunkindex . '.chunk');

    // Cast the Dropzone fields to ints; request input always arrives as strings
    $chunkIndex = (int) $request->dzchunkindex;
    $totalChunks = (int) $request->dztotalchunkcount;

    if ($chunkIndex === $totalChunks - 1) {
        // Finished uploading. Assemble the file and send it to the transcoder.
        // Create an empty file for storing the assembled contents
        Storage::put("processing/$processing_filename", '');

        for ($i = 0; $i < $totalChunks; $i++) {
            $contents = Storage::get("processing/$processing_filename.$i.chunk");

            // Pass '' as the separator: append() defaults to PHP_EOL,
            // which would corrupt a binary file
            Storage::append("processing/$processing_filename", $contents, '');

            // Delete the chunk
            Storage::delete("processing/$processing_filename.$i.chunk");
        }

        // Move the newly assembled file to storage/app/transcoding
        Storage::move("processing/$processing_filename", "transcoding/$processing_filename");

        // Create the record and hand it, plus the temporary filename, to the transcoding job
        $video = Video::create([
            'user_id' => '1',
            'title' => 'test',
            'status' => 'processing',
            'description' => 'test description',
        ]);

        TranscodeVideo::dispatch($video, $processing_filename);
    }
}
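
For reference, here is my best guess at what the streams advice means, after re-reading that docs section: assemble the chunks with native PHP stream handles and stream_copy_to_stream() instead of Storage::get()/append(). This is an untested sketch that would replace the assembly loop above, and it assumes the default local disk so that storage_path('app/...') points at the chunk files:

// Untested sketch: stream-based assembly, assumes the local disk.
// 'ab' creates the destination file if it doesn't exist yet.
$destination = fopen(storage_path("app/processing/{$processing_filename}"), 'ab');

for ($i = 0; $i < $totalChunks; $i++) {
    $chunkPath = storage_path("app/processing/{$processing_filename}.{$i}.chunk");

    // stream_copy_to_stream() copies through a small internal buffer,
    // so memory use stays flat no matter how big each chunk is
    $chunk = fopen($chunkPath, 'rb');
    stream_copy_to_stream($chunk, $destination);
    fclose($chunk);

    unlink($chunkPath); // delete the chunk once it has been copied
}

fclose($destination);

Is that roughly what the docs mean by streams, or is there a more Storage-centric way to do it?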

This is just a first draft of the implementation, so I'm open to suggestions. It's what I came up with after a quick think about the situation, and I'm moderately happy with it. I will have a scheduled job clearing out old, unprocessed uploads every 24 hours (a rough sketch of that is below). My only remaining hangup is how best to assemble and then move the uploaded chunks before transcoding, and whether the stream-based sketch above is the right direction.
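
For completeness, here is roughly what I have in mind for the cleanup, in app/Console/Kernel.php. Again untested, and it assumes the same processing directory as above (the Kernel would also need use Carbon\Carbon; and use Illuminate\Support\Facades\Storage;):

protected function schedule(Schedule $schedule)
{
    $schedule->call(function () {
        // Delete anything in processing/ that hasn't been touched in 24 hours,
        // i.e. abandoned chunks from uploads that never finished
        foreach (Storage::files('processing') as $file) {
            if (Storage::lastModified($file) < Carbon::now()->subDay()->timestamp) {
                Storage::delete($file);
            }
        }
    })->daily();
}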
