
LailaIH:

Best approach for handling large file uploads and ZIP generation in Laravel?

I’m building a Laravel 12 folder organizer where users may upload large folders (many files / large sizes), and I categorize files by extension and generate a ZIP for download.

Current approach:

Process everything in one request

Loop through uploaded files once

Add files directly from PHP temp upload paths into a ZIP using ZipArchive::addFile()
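In plain PHP, that single-pass flow looks roughly like this (the function name and extension-folder layout are illustrative, not from my actual code):

```php
<?php
// Minimal sketch of the current single-request flow: group uploaded
// files by extension, then add each one to a ZIP in one pass.
// $uploads maps original filenames to their temp upload paths.
function buildZip(array $uploads, string $zipPath): bool
{
    $zip = new ZipArchive();
    if ($zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE) !== true) {
        return false;
    }
    foreach ($uploads as $name => $tmpPath) {
        // Folder inside the archive = file extension (e.g. "pdf/report.pdf").
        $ext = strtolower(pathinfo($name, PATHINFO_EXTENSION)) ?: 'other';
        // addFile() only records the path; the data is read when close()
        // writes the archive, so the temp files must still exist then.
        $zip->addFile($tmpPath, $ext . '/' . $name);
    }
    return $zip->close();
}
```

This all runs inside the upload request, which is exactly the part I'm asking about.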

For large folder uploads, what’s the best approach for performance and scalability?

Also, what would you do specifically to speed up processing for large uploads?

imrandevbd:

The single-request approach is a ticking time bomb for max_execution_time and memory exhaustion. Once users start dropping gigabytes of files, your server will choke and browser connections will time out. Use chunked uploads or asynchronous processing instead.
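For the asynchronous side, the idea is to move ZIP generation into a queued job so the request returns immediately. A minimal sketch, assuming the uploaded files have already been stored in a directory (the class name, queue name, and timeout are my assumptions, not something from your code):

```php
<?php
// Illustrative Laravel queued job: build the ZIP off the request cycle.
namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use ZipArchive;

class BuildFolderZip implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $timeout = 600; // generous limit for large archives

    public function __construct(
        public string $sourceDir,  // where the uploaded files were stored
        public string $zipPath     // where the finished archive should land
    ) {}

    public function handle(): void
    {
        $zip = new ZipArchive();
        $zip->open($this->zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
        foreach (new \DirectoryIterator($this->sourceDir) as $file) {
            if ($file->isFile()) {
                $ext = strtolower($file->getExtension()) ?: 'other';
                $zip->addFile($file->getPathname(), $ext.'/'.$file->getFilename());
            }
        }
        $zip->close();
    }
}
```

The controller then just calls `BuildFolderZip::dispatch($dir, $zipPath)` and responds right away, and a queue worker does the heavy lifting.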

martinbean:

@lailaih You need to look into multipart/chunked uploading for large uploads. The file is sent in multiple chunks (and requests), then re-assembled server-side. That means any chunk that fails can be retried on its own, and the entire upload can be paused and resumed.
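The server-side re-assembly can be surprisingly small. A sketch, assuming the client sends a chunk index, the total chunk count, and a stable upload ID with each request (all of those field names are assumptions; libraries like Resumable.js or Uppy give you this for free):

```php
<?php
// Sketch of server-side chunk re-assembly. Each chunk is appended to a
// ".part" file; when the last chunk arrives the upload is complete, so a
// failed chunk can simply be re-sent without restarting the whole upload.
function handleChunk(string $uploadId, int $index, int $total, string $chunkTmpPath, string $storageDir): bool
{
    $partPath = $storageDir . '/' . basename($uploadId) . '.part';
    // Plain appending assumes chunks arrive (or are retried) in order;
    // out-of-order schemes seek to $index * $chunkSize and write there.
    file_put_contents($partPath, file_get_contents($chunkTmpPath), FILE_APPEND | LOCK_EX);
    if ($index === $total - 1) {
        rename($partPath, $storageDir . '/' . basename($uploadId)); // finished
        return true;
    }
    return false; // more chunks expected
}
```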

MuhammadAli2003:

@lailaih The suggestions about chunked uploads and async processing are definitely the right direction, but there’s another layer to consider once file sizes or volume increase.

The main issue isn’t just uploading — it’s everything happening after that:

– ZIP generation blocking the request
– disk I/O spikes when handling multiple large files
– downloads becoming slow or inconsistent under load

A more scalable approach is to split the flow:

  1. Upload (chunked and stored safely)
  2. Queue processing (ZIP generation in the background)
  3. Serve the final file separately instead of generating it on demand
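The glue between steps 1 and 2 is a controller that only records the finished upload and queues the work. A sketch (the controller, model, job, and column names are all illustrative, not something from your app):

```php
<?php
// Illustrative controller for the three-step split above: the request
// never builds the ZIP itself, it just queues the job and returns 202.
namespace App\Http\Controllers;

use App\Jobs\BuildFolderZip;   // hypothetical queued job for step 2
use App\Models\Folder;
use Illuminate\Http\Request;

class FolderZipController extends Controller
{
    public function finalize(Request $request)
    {
        // Step 1 is done at this point: chunks re-assembled and stored.
        $folder = Folder::create(['status' => 'processing']);

        // Step 2: ZIP generation runs on a queue worker.
        BuildFolderZip::dispatch($folder->id)->onQueue('zips');

        // Step 3 is a separate download endpoint; the client polls (or
        // listens on a broadcast channel) until status becomes "ready".
        return response()->json(['id' => $folder->id, 'status' => 'processing'], 202);
    }
}
```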

Also, serving large ZIP files directly from the application layer can become a bottleneck. Separating delivery from processing usually improves stability and response times.
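One common way to take delivery off the application layer with nginx is an internal redirect: PHP only authorizes the download, and nginx streams the bytes. A sketch, assuming an nginx `internal` location named `/protected/` mapped to the ZIP storage directory (that location and the `Folder` columns are my assumptions):

```php
<?php
// Sketch: hand byte-streaming to nginx via X-Accel-Redirect so PHP never
// buffers the large ZIP in memory or ties up a worker for the download.
namespace App\Http\Controllers;

use App\Models\Folder;

class DownloadController extends Controller
{
    public function download(Folder $folder)
    {
        abort_unless($folder->status === 'ready', 404);

        return response('', 200, [
            'Content-Type'        => 'application/zip',
            'Content-Disposition' => 'attachment; filename="'.$folder->name.'.zip"',
            // nginx intercepts this header and serves the file itself.
            'X-Accel-Redirect'    => '/protected/'.$folder->zip_filename,
        ]);
    }
}
```

Apache has the equivalent `X-Sendfile` (via mod_xsendfile), and on S3-style storage you'd return a temporary signed URL instead.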

One thing worth thinking about — how are you planning to handle cleanup of temporary files in this setup without risking deleting something still in use?
