LailaIH liked a comment (2w ago):
@lailaih You need to look into multipart/chunked uploading for large uploads. The file is sent in multiple chunks (one per request) and reassembled server-side, which means any failed chunks can be retried, and the entire upload can be paused and resumed.
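Here's a rough sketch of the receiving end, assuming the client sends an upload_id, chunk_index, and total_chunks with each chunk (all placeholder names) and that chunks arrive in order. In practice a package like pion/laravel-chunk-upload, or implementing the tus protocol, handles the hard parts (out-of-order chunks, retries, resumption) for you:

```php
<?php
// Sketch of a chunk-receiving endpoint. Field names are placeholders,
// and chunks are assumed to arrive in order.

namespace App\Http\Controllers;

use Illuminate\Http\Request;

class ChunkUploadController extends Controller
{
    public function store(Request $request)
    {
        $request->validate([
            'chunk'        => 'required|file',
            'upload_id'    => 'required|string', // client-generated; sanitise before using in a path!
            'chunk_index'  => 'required|integer|min:0',
            'total_chunks' => 'required|integer|min:1',
        ]);

        $tempPath = storage_path('app/chunks/'.$request->input('upload_id'));

        if (! is_dir(dirname($tempPath))) {
            mkdir(dirname($tempPath), 0775, true);
        }

        // Append this chunk's bytes to the partial file.
        file_put_contents(
            $tempPath,
            file_get_contents($request->file('chunk')->getRealPath()),
            FILE_APPEND | LOCK_EX
        );

        $isLast = (int) $request->input('chunk_index') + 1
            === (int) $request->input('total_chunks');

        if ($isLast) {
            // All chunks received: move the assembled file into place
            // (assumes storage/app/uploads already exists).
            rename($tempPath, storage_path('app/uploads/'.$request->input('upload_id')));

            return response()->json(['status' => 'complete']);
        }

        return response()->json(['status' => 'partial']);
    }
}
```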
LailaIH liked a comment (3w ago):
The single-request approach is a ticking time bomb for max_execution_time and memory exhaustion. Once users start dropping gigabytes of files, your server will choke and browser connections will time out. You can use chunked uploads or asynchronous processing.
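For the asynchronous option, here's a minimal sketch of pushing the ZIP building onto a queue worker (run via `php artisan queue:work`), so the HTTP request returns immediately and isn't bound by max_execution_time. The class and parameter names are illustrative, and the uploaded files are assumed to be persisted to disk already:

```php
<?php
// Illustrative queued job: builds the ZIP in the background.

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use ZipArchive;

class BuildZipArchive implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(
        public array $filePaths,  // absolute paths of the already-stored files
        public string $zipPath    // where the finished archive should go
    ) {}

    public function handle(): void
    {
        $zip = new ZipArchive();
        $zip->open($this->zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);

        foreach ($this->filePaths as $path) {
            // Group entries inside the archive by file extension.
            $ext = pathinfo($path, PATHINFO_EXTENSION) ?: 'other';
            $zip->addFile($path, $ext.'/'.basename($path));
        }

        $zip->close();
    }
}

// Controller side: respond immediately and let a worker build the ZIP.
// BuildZipArchive::dispatch($paths, storage_path('app/zips/result.zip'));
```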
LailaIH started a new conversation (3w ago):
I’m building a Laravel 12 folder organizer where users may upload large folders (many files, potentially large total sizes). I categorize the files by extension and generate a ZIP for download.

Current approach (sketched in the code below):

- Process everything in one request
- Loop through the uploaded files once
- Add files directly from PHP’s temp upload paths into a ZIP using ZipArchive::addFile()
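Roughly, the current flow looks like this (simplified; 'files' is a placeholder input name, not my exact code):

```php
$zip = new ZipArchive();
$zip->open(storage_path('app/organized.zip'), ZipArchive::CREATE | ZipArchive::OVERWRITE);

foreach ($request->file('files', []) as $file) {
    $ext = strtolower($file->getClientOriginalExtension()) ?: 'other';
    // Add straight from PHP's temp upload path, grouped by extension.
    $zip->addFile($file->getRealPath(), $ext.'/'.$file->getClientOriginalName());
}

$zip->close();
return response()->download(storage_path('app/organized.zip'));
```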
For large folder uploads, what’s the best approach for performance and scalability?
Also, what would you do specifically to speed up processing for large uploads?