
VladPtashnyk06 asked:

Laravel job for exporting millions of records to CSV - how to optimize and avoid memory errors?

Hello. I'm writing a Laravel job that exports product analytics to a CSV file and emails a download link. A given period can contain up to 10 million records. Currently I use cursor() to stream the results and fflush() after each write:

public function handle()
{
    // "Starting product analytics export"
    \Log::info('Початок експорту аналітики продуктів', [
        'filters' => $this->filters,
        'email' => $this->email,
        'startDate' => $this->startDate,
        'endDate' => $this->endDate,
    ]);

    // format('Y-m-d') avoids the spaces and colons a stringified Carbon (startOfDay()) would put in the filename
    $relativePath = 'reports/product_analytics_' . Carbon::parse($this->startDate)->format('Y-m-d') . '_' . Carbon::parse($this->endDate)->format('Y-m-d') . '.csv';
    $absolutePath = storage_path('app/public/' . $relativePath);

    $stream = fopen($absolutePath, 'w');
    $csv = Writer::createFromStream($stream);
    $csv->insertOne([ // header row (Ukrainian): product ID, title, SKU, category ID, category title, views, clicks, purchases, created date
        'ID товару',
        'Назва товару',
        'Артикул',
        'ID категорії',
        'Назва категорії',
        'Перегляди',
        'Кліки',
        'Купівлі',
        'Дата створення',
    ]);

    $query = ProductAnalytic::with(['product.category'])
        ->whereBetween('created_at', [
            Carbon::parse($this->startDate)->startOfDay(),
            Carbon::parse($this->endDate)->endOfDay(),
        ])
        ->orderBy('id');

    if (!empty($this->filters['title'])) {
        // no 'p' alias exists on this Eloquent query; filter through the relation instead
        $query->whereHas('product', fn ($q) => $q->where('title', 'like', '%' . $this->filters['title'] . '%'));
    }

    if (!empty($this->filters['code'])) {
        $query->whereHas('product', fn ($q) => $q->where('code', $this->filters['code']));
    }

    foreach ($query->cursor() as $row) {
        $product = $row->product;
        $category = $product->category ?? null;

        $csv->insertOne([
            $row->product_id ?? '',
            $product->title ?? '',
            $product->code ?? '',
            $product->category_id ?? '',
            $category->title ?? '',
            $row->views ?? '0',
            $row->clicks ?? '0',
            $row->buy ?? '0',
            optional($row->created_at)->format('Y-m-d H:i:s'),
        ]);

        fflush($stream); // flushing every row is safe but adds syscall overhead; every N rows is enough
    }

    fclose($stream);

    $url = Storage::disk('public')->url($relativePath);
    Mail::to($this->email)->send(new ProductAnalyticsReportMail($url));

    \Log::info("Файл успішно створено та надіслано на пошту: {$this->email}"); // "File created and emailed successfully to: …"
}

Are there any best practices for large CSV exports? Thank you very much.

0 likes
4 replies
VladPtashnyk06 replied:

Here is the error:

[2025-05-28 13:50:30] local.ERROR: Allowed memory size of 536870912 bytes exhausted (tried to allocate 85999616 bytes) {"exception":"[object] (Symfony\Component\ErrorHandler\Error\FatalError(code: 0): Allowed memory size of 536870912 bytes exhausted (tried to allocate 85999616 bytes) at /home/valimsit/superprice.ua/www/super-price/vendor/laravel/framework/src/Illuminate/Database/Connection.php:495) [stacktrace] #0 {main} "}
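Running out of memory despite cursor() points at a known MySQL driver pitfall: in PDO's default buffered-query mode, the entire result set is copied into PHP memory when the statement executes, before cursor() yields a single row (which is why the fatal fires inside Connection.php). A sketch of switching the connection to unbuffered mode for the export; the attribute assumes the pdo_mysql driver:

```php
use Illuminate\Support\Facades\DB;

// Assumption: pdo_mysql driver. With MYSQL_ATTR_USE_BUFFERED_QUERY at its
// default (true), PDO buffers the whole result set in PHP memory, so cursor()
// only saves on model hydration, not on the raw rows.
$pdo = DB::connection()->getPdo();
$pdo->setAttribute(\PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

foreach ($query->toBase()->cursor() as $row) {
    // rows now stream off the wire as you iterate
}

// An unbuffered connection cannot run other queries until this result set is
// fully consumed, so restore the default once the export loop finishes.
$pdo->setAttribute(\PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, true);
```

Note the trade-off: while the statement is open and unbuffered, nothing else can use that database connection.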

VladPtashnyk06 replied:

$query = ProductAnalytic::query()
    ->join('products as p', 'product_analytics.product_id', '=', 'p.id')
    ->leftJoin('categories as c', 'p.category_id', '=', 'c.id')
    ->select([
        'product_analytics.product_id as product_id',
        'product_analytics.views as views',
        'product_analytics.clicks as clicks',
        'product_analytics.buy as buy',
        'product_analytics.created_at as created_at',
        'p.title as product_title',
        'p.code as product_code',
        'p.category_id',
    ])
    ->selectRaw("JSON_UNQUOTE(JSON_EXTRACT(c.title, '$.uk')) as category_title_uk")
    ->whereBetween('product_analytics.created_at', [
        Carbon::parse($this->startDate)->startOfDay(),
        Carbon::parse($this->endDate)->endOfDay(),
    ])
    ->orderBy('product_analytics.id');

    if (!empty($this->filters['title'])) {
        $query->where('p.title', 'like', '%' . $this->filters['title'] . '%');
    }

    if (!empty($this->filters['code'])) {
        $query->where('p.code', $this->filters['code']);
    }

    foreach ($query->toBase()->cursor() as $row) {
        $csv->insertOne([
            $row->product_id,
            $row->product_title,
            $row->product_code,
            $row->category_id,
            $row->category_title_uk,
            $row->views ?? '0',
            $row->clicks ?? '0',
            $row->buy ?? '0',
            $row->created_at,
        ]);

        fflush($stream);
    }

Changed slightly.
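The toBase() call is doing real work here: cursor() on an Eloquent builder still hydrates one full ProductAnalytic model per row (casts, accessors, attribute bookkeeping), while toBase()->cursor() yields plain stdClass objects. A minimal illustration of the difference:

```php
// Eloquent cursor: one ProductAnalytic model hydrated per row
foreach (ProductAnalytic::query()->cursor() as $model) {
    // $model carries casts, accessors, and attribute arrays
}

// Base-query cursor: one lightweight stdClass per row
foreach (ProductAnalytic::query()->toBase()->cursor() as $row) {
    // $row->views, $row->clicks are plain properties; far less per-row overhead
}
```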

VladPtashnyk06 replied:

The whole method

public function handle()
{
    // "Starting product analytics export"
    \Log::info('Початок експорту аналітики продуктів', [
        'filters' => $this->filters,
        'email' => $this->email,
        'startDate' => $this->startDate,
        'endDate' => $this->endDate,
    ]);

    // format('Y-m-d') avoids the spaces and colons a stringified Carbon (startOfDay()) would put in the filename
    $relativePath = 'reports/product_analytics_' . Carbon::parse($this->startDate)->format('Y-m-d') . '_' . Carbon::parse($this->endDate)->format('Y-m-d') . '.csv';
    $absolutePath = storage_path('app/public/' . $relativePath);

    $stream = fopen($absolutePath, 'w');
    $csv = Writer::createFromStream($stream);
    $csv->insertOne([ // header row (Ukrainian): product ID, title, SKU, category ID, category title, views, clicks, purchases, created date
        'ID товару',
        'Назва товару',
        'Артикул',
        'ID категорії',
        'Назва категорії',
        'Перегляди',
        'Кліки',
        'Купівлі',
        'Дата створення',
    ]);

    $query = ProductAnalytic::query()
        ->join('products as p', 'product_analytics.product_id', '=', 'p.id')
        ->leftJoin('categories as c', 'p.category_id', '=', 'c.id')
        ->select([
            'product_analytics.product_id as product_id',
            'product_analytics.views as views',
            'product_analytics.clicks as clicks',
            'product_analytics.buy as buy',
            'product_analytics.created_at as created_at',
            'p.title as product_title',
            'p.code as product_code',
            'p.category_id',
        ])
        ->selectRaw("JSON_UNQUOTE(JSON_EXTRACT(c.title, '$.uk')) as category_title_uk")
        ->whereBetween('product_analytics.created_at', [
            Carbon::parse($this->startDate)->startOfDay(),
            Carbon::parse($this->endDate)->endOfDay(),
        ])
        ->orderBy('product_analytics.id');

    if (!empty($this->filters['title'])) {
        $query->where('p.title', 'like', '%' . $this->filters['title'] . '%');
    }

    if (!empty($this->filters['code'])) {
        $query->where('p.code', $this->filters['code']);
    }

    foreach ($query->toBase()->cursor() as $row) {
        $csv->insertOne([
            $row->product_id,
            $row->product_title,
            $row->product_code,
            $row->category_id,
            $row->category_title_uk,
            $row->views ?? '0',
            $row->clicks ?? '0',
            $row->buy ?? '0',
            $row->created_at,
        ]);

        fflush($stream);
    }

    fclose($stream);

    $url = Storage::disk('public')->url($relativePath);
    Mail::to($this->email)->send(new ProductAnalyticsReportMail($url));

    \Log::info("Файл успішно створено та надіслано на пошту: {$this->email}"); // "File created and emailed successfully to: …"
}
krisi_gjika replied:

Use ->chunk() or ->each() to process a slice of the results at a time:

$query->each(fn (ProductAnalytic $p) => $csv->insertOne([
    $p->product_id,
    $p->product_title,
    $p->product_code,
    $p->category_id,
    $p->category_title_uk,
    $p->views ?? '0',
    $p->clicks ?? '0',
    $p->buy ?? '0',
    $p->created_at,
]));
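One caveat worth adding to this suggestion: chunk() and each() paginate with LIMIT/OFFSET, which gets progressively slower as the offset climbs toward 10 million, and chunking a query ordered by anything other than the chunk key can skip or repeat rows if data changes mid-export. chunkById() keyset-paginates on the id instead (WHERE id > :last LIMIT :n), so every chunk is equally cheap. A sketch against the joined query above; the column/alias arguments are assumptions based on the product_analytics.id ordering:

```php
// chunkById replaces any existing orderBy with its own ordering on the key,
// and resumes each chunk from the last seen id instead of using OFFSET.
$query->toBase()->chunkById(1000, function ($rows) use ($csv, $stream) {
    foreach ($rows as $row) {
        $csv->insertOne([
            $row->product_id,
            $row->product_title,
            $row->product_code,
            $row->category_id,
            $row->category_title_uk,
            $row->views ?? '0',
            $row->clicks ?? '0',
            $row->buy ?? '0',
            $row->created_at,
        ]);
    }
    fflush($stream); // one flush per chunk is plenty
}, 'product_analytics.id', 'id');
```

lazyById() gives the same keyset behaviour as a foreach-friendly LazyCollection, if you prefer the cursor-style loop.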
