LailaIH wrote a reply

2d ago

thank you

LailaIH liked a comment

2d ago

The single-request approach is a ticking time bomb for max_execution_time and memory exhaustion. Once users start dropping gigabytes of files, your server will choke and browser connections will time out. Consider chunked uploads or asynchronous processing instead.
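For the asynchronous route, here is a minimal sketch, assuming the upload has already been persisted to a directory on disk; the job name ZipUploadedFolder and both path parameters are placeholders, not anything from your code:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use ZipArchive;

class ZipUploadedFolder implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(
        public string $sourceDir, // directory the upload was persisted to
        public string $zipPath,   // where the finished archive should land
    ) {}

    public function handle(): void
    {
        $zip = new ZipArchive();
        $zip->open($this->zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);

        // Flat directory for simplicity; recurse if the upload keeps subfolders
        foreach (glob($this->sourceDir.'/*') as $path) {
            if (is_file($path)) {
                // addFile() only registers the path; compression happens on close()
                $zip->addFile($path, basename($path));
            }
        }

        $zip->close();
    }
}
```

The controller then just stores the files and calls ZipUploadedFolder::dispatch($dir, $zipPath); the request returns immediately, and you notify the user (broadcast, poll, or email) when the archive is ready.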

LailaIH started a new conversation

2d ago

I’m building a Laravel 12 folder organizer where users may upload large folders (many files, large total size). I categorize the files by extension and generate a ZIP for download.

Current approach:

Process everything in one request

Loop through uploaded files once

Add files directly from PHP temp upload paths into a ZIP using ZipArchive::addFile() (a minimal sketch follows this list)
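Roughly, that flow looks like the sketch below; the controller class and the `files` input name are simplified placeholders, not my exact code:

```php
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use ZipArchive;

class FolderOrganizerController extends Controller
{
    public function store(Request $request)
    {
        $zipPath = storage_path('app/organized.zip');

        $zip = new ZipArchive();
        $zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);

        foreach ($request->file('files', []) as $file) {
            // Categorize by extension; fall back to "other" when there is none
            $ext = strtolower($file->getClientOriginalExtension()) ?: 'other';

            // Added straight from the PHP temp upload path; read on close()
            $zip->addFile($file->getRealPath(), $ext.'/'.$file->getClientOriginalName());
        }

        $zip->close();

        return response()->download($zipPath);
    }
}
```

All of this happens inside the one request, which is where the time and memory limits bite.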

For large folder uploads, what’s the best approach for performance and scalability?

Also, what would you do specifically to speed up processing for large uploads?