The single-request approach is a ticking time bomb for max_execution_time and memory exhaustion. Once users start dropping gigabytes of files, your server will choke and browser connections will time out. You can use chunked uploads for the transfer and asynchronous processing (a queued job) for the ZIP generation.
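A minimal sketch of the asynchronous side, assuming the upload request has already stored the files on a disk (the 'uploads' disk name and the BuildZipJob class are placeholders, not anything from your code):

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;
use ZipArchive;

// Hypothetical job: builds the ZIP on a queue worker instead of inside the web request.
class BuildZipJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public int $timeout = 3600; // queue worker limit, not tied to the web request's max_execution_time

    /** @param string[] $paths  files already stored on the 'uploads' disk */
    public function __construct(public array $paths, public string $zipName) {}

    public function handle(): void
    {
        $zipPath = Storage::disk('local')->path($this->zipName);

        $zip = new ZipArchive();
        $zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);

        foreach ($this->paths as $path) {
            // addFile() streams from disk, so memory stays flat even for huge files.
            $ext = strtolower(pathinfo($path, PATHINFO_EXTENSION)) ?: 'misc';
            $zip->addFile(Storage::disk('uploads')->path($path), $ext.'/'.basename($path));
        }

        $zip->close();
        // Flag the ZIP as ready (event / notification / DB column) so the UI can offer the download.
    }
}
```

Dispatching it after the upload finishes is just `BuildZipJob::dispatch($storedPaths, 'organized.zip');`. For the transfer itself, chunked uploads mean the browser sends the folder in small pieces (e.g. with Uppy or Resumable.js) and a controller appends each chunk to a file on disk, so no single request ever comes close to the PHP limits.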
Best approach for handling large file uploads and ZIP generation in Laravel?
I’m building a Laravel 12 folder organizer where users may upload large folders (many files / large sizes), and I categorize files by extension and generate a ZIP for download.
Current approach (simplified sketch below):
Process everything in one request
Loop through uploaded files once
Add files directly from PHP temp upload paths into a ZIP using ZipArchive::addFile()
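Simplified sketch of what that looks like (the 'files' input name and the per-extension folders are just illustrative):

```php
<?php

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;
use ZipArchive;

// Everything happens in one request: receive uploads, categorize, zip, return.
Route::post('/organize', function (Request $request) {
    $zipPath = storage_path('app/organized.zip');

    $zip = new ZipArchive();
    $zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);

    foreach ($request->file('files', []) as $file) {
        // Categorize by extension and add straight from the PHP temp upload path.
        $folder = strtolower($file->getClientOriginalExtension()) ?: 'misc';
        $zip->addFile($file->getRealPath(), $folder.'/'.$file->getClientOriginalName());
    }

    $zip->close();

    return response()->download($zipPath)->deleteFileAfterSend();
});
```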
For large folder uploads, what’s the best approach for performance and scalability?
Also, what would you do specifically to speed up processing for large uploads?