
Bakanyaka

Handling large file downloads

Hello. I need to allow users to download large files (1 GB and more). At the moment I'm just using Laravel's Storage facade. Basically it is:

 return Storage::download("uploads/{$upload->uuid}.zip");

inside my download controller. I'm seeing very different behavior on my development and production servers. I use Laradock for development and a manually configured nginx + PHP-FPM setup in production. Both servers have the same PHP memory limit, but in production my app crashes with:

Allowed memory size of 268435456 bytes exhausted (tried to allocate 535699456 bytes)

while on the development server it works just fine and memory usage doesn't even increase during the download. I suspect it has something to do with the server configuration, but I don't know where to look. Any suggestions?

Sirik

Try this

https://laravel.com/docs/6.x/responses#file-downloads

Streamed Downloads

Sometimes you may wish to turn the string response of a given operation into a downloadable response without having to write the contents of the operation to disk. You may use the streamDownload method in this scenario. This method accepts a callback, file name, and an optional array of headers as its arguments:

return response()->streamDownload(function () {
    echo GitHub::api('repo')
                ->contents()
                ->readme('laravel', 'laravel')['contents'];
}, 'laravel-readme.md');
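The docs example streams an API response, but the same pattern applies to a local file. Here's a minimal sketch of the chunked-read idea: `fileChunks` is a hypothetical helper (not part of Laravel) that yields a large file in fixed-size pieces so you never hold the whole file in memory.

```php
<?php
// Sketch (assumption): a hypothetical generator that reads a file in
// fixed-size chunks, so the full file is never loaded into memory.
function fileChunks(string $path, int $chunkSize = 8192): \Generator
{
    $stream = fopen($path, 'rb');
    while (!feof($stream)) {
        yield fread($stream, $chunkSize);
    }
    fclose($stream);
}
```

In a controller, this could feed the streamDownload callback: `return response()->streamDownload(function () use ($path) { foreach (fileChunks($path) as $chunk) { echo $chunk; } }, basename($path));`.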
Bakanyaka

It's clearly not a problem with the PHP code, because the same code works on the dev server, where it processes files even larger than my whole Docker VM's memory.

And as far as I can see, Laravel's Storage::download is already using a streamed response under the hood:

    /**
     * Create a streamed response for a given file.
     *
     * @param  string  $path
     * @param  string|null  $name
     * @param  array|null  $headers
     * @param  string|null  $disposition
     * @return \Symfony\Component\HttpFoundation\StreamedResponse
     */
    public function response($path, $name = null, array $headers = [], $disposition = 'inline')
    {
        $response = new StreamedResponse;

        $filename = $name ?? basename($path);

        $disposition = $response->headers->makeDisposition(
            $disposition, $filename, $this->fallbackName($filename)
        );

        $response->headers->replace($headers + [
            'Content-Type' => $this->mimeType($path),
            'Content-Length' => $this->size($path),
            'Content-Disposition' => $disposition,
        ]);

        $response->setCallback(function () use ($path) {
            $stream = $this->readStream($path);

            while (! feof($stream)) {
                echo fread($stream, 2048);
            }

            fclose($stream);
        });

        return $response;
    }

I managed to solve the problem by setting

output_buffering = 0

in php.ini on the production server. But I still don't understand why it works on the dev server: phpinfo() showed that output_buffering had the same value (4096) on both servers before the change.
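If changing output_buffering globally in php.ini is undesirable, an alternative (a sketch, not something Laravel does for you) is to discard any active output buffers per request, at the top of the streaming callback, so echoed chunks go straight to the SAPI instead of accumulating in a PHP buffer:

```php
<?php
// Sketch: discard any active PHP output buffers before streaming a download.
// A per-request alternative to setting output_buffering = 0 in php.ini.
function disableOutputBuffers(): void
{
    while (ob_get_level() > 0) {
        ob_end_clean(); // drop the buffer's contents and remove the buffer
    }
}
```

You would call this at the start of the StreamedResponse callback, and call flush() after each echoed chunk so the web server forwards data immediately.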

MassimoGnocchi

Hi there, joining the thread because I had to use the same solution, setting output_buffering = 0. Otherwise Storage::disk('local')->download('file_path') results in a web error.

I still do not understand why it's needed, though. Why?

Massimo.

Bvanhaastrecht

Hi, I'd like to bump this. We are also experiencing this problem: we can't download files larger than about 1.5 GB with Storage::download('file.zip'). The browser stops the download with a network error. We are able to download the same file without errors directly from the /public folder.

We tried setting output_buffering = 0, but that gave us even more trouble with downloads.

The system is not under pressure and has more than enough memory/CPU resources available.
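Since the production stack here is nginx + PHP-FPM, another option worth considering is offloading the transfer to nginx entirely via X-Accel-Redirect, so PHP only emits headers and nginx streams the file itself. This is a sketch under assumptions: `/protected/` is a hypothetical URI prefix that must be mapped to the storage directory by an `internal` location block in your nginx config.

```php
<?php
// Sketch (assumption): build headers that tell nginx to serve the file
// itself via X-Accel-Redirect. `/protected/...` is a hypothetical internal
// URI that a matching `internal` location in nginx must map to the real path.
function accelRedirectHeaders(string $internalUri, string $downloadName): array
{
    return [
        'X-Accel-Redirect'    => $internalUri,
        'Content-Type'        => 'application/octet-stream',
        'Content-Disposition' => 'attachment; filename="' . $downloadName . '"',
    ];
}
```

In a Laravel controller this might look like `return response('', 200, accelRedirectHeaders('/protected/file.zip', 'file.zip'));` — the response body stays empty because nginx takes over the download, which sidesteps PHP memory limits and buffering entirely.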
