
whambulance:

Queued file imports via Laravel Excel fail above a certain row count on Laravel Vapor

Issue:

I'm using Laravel Excel to import a ~100k-row file via the ToModel concern, combined with the ShouldQueue, WithChunkReading, and WithBatchInserts concerns.

Locally, it works fine: import jobs are queued to a Redis container, and all ~100k rows are successfully inserted into a local Docker MySQL database.

The issue is on our live site, which we deploy with Laravel Vapor. Through Vapor, files under a certain number of rows import fine, but once they're just over 29k rows (16 columns, averaging 4.2 characters per cell) they fail. CloudWatch's queue log shows no messages, implying the job never even starts. The file itself is uploaded to AWS S3 via the Laravel Vapor file upload function, and that upload succeeds every time.

With Vapor, if a job fails, a row is added to the failed_jobs table in the application's main database. No rows are added, which again suggests the job is never started in the first place.

Example:

The example imports into the users table, assuming every field in the table is nullable, and assumes it's running on AWS after deployment via Laravel Vapor. The vapor.yml I'm using is below.

vapor.yml:

environments:
  staging:
      queue-timeout: 900
      memory: 1024
      cli-memory: 512
      runtime: 'php-8.0:al2'
      build:
          - 'COMPOSER_MIRROR_PATH_REPOS=1 composer install'
          - 'php artisan event:cache'
          - 'npm ci && npm run dev && rm -rf node_modules'

Controller method to call upload:

public function import(Request $request)
{
    $validStorage = $this->validateStorageDetails($request);

    $upload = ExcelUpload::create([
        'user_id' => Auth::user()->id,
        'table' => 'users',
        'status' => 'pending',
        'completed_at' => null,
    ]);
    
    try {
        // Adding a file extension back onto the file so that laravel excel
        // knows what type of file it is
        $path_parts = pathinfo($validStorage['filename']);
        $s3path = $validStorage['key'] . "." . $path_parts['extension'];
        Storage::disk('s3')->move($validStorage['key'], $s3path);

        Excel::queueImport(new UserImport($upload->id), $s3path, 's3')
            ->chain([
                // Called to mark the $upload as complete
                new CompleteExcelUpload($upload->id, UserImport::class)
            ]);

    } catch (Exception $e) {
        $upload->status = 'failed';
        $upload->save();
        // Return the message rather than the Exception object, which
        // doesn't serialize usefully to JSON
        return response()->json(['message' => $e->getMessage()], 500);
    }

    return response()->json(['message' => 'uploaded successfully'], 200);
}
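Since Excel::queueImport() serializes and dispatches the job chain from inside this request, a rejection by the queue backend (for example an SQS message-size rejection) should surface as an exception in the catch block. Logging it, rather than only embedding it in the JSON response, makes the failure visible in CloudWatch. A minimal sketch of an alternative catch body, using Laravel's Log facade:

```php
use Illuminate\Support\Facades\Log;

// Replacement body for the catch block in import() above
} catch (Exception $e) {
    // Surface the dispatch-time failure (e.g. an SQS "Entity Too Large"
    // rejection) in the application log so it shows up in CloudWatch
    Log::error('Excel import dispatch failed', [
        'upload_id' => $upload->id,
        'error'     => $e->getMessage(),
    ]);

    $upload->status = 'failed';
    $upload->save();

    return response()->json(['message' => $e->getMessage()], 500);
}
```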

Validation method for storage details:

public function validateStorageDetails(Request $request)
{
    return $request->validate([
        'uuid' => ['required', 'string'],
        'key' => [
            'required', 
            'string',
            // The file being uploaded isn't an xls file, it's xlsx - I 
            // know that there are xls issues with PHPSpreadsheet
            new StorageFile(20000000, ['application/xlsx', 'application/xls'])
        ],
        'bucket' => ['required', 'string'],
        'filename' => ['required', 'string'],
        'content_type' => ['required', 'string'],
    ]);
}
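One aside on the rule above: 'application/xlsx' and 'application/xls' are not the MIME types browsers actually send for Excel files. If the StorageFile rule matches on the uploaded content type, the standard types may be what's needed. A sketch of the same rule with the official MIME types (assuming StorageFile takes a size limit and a list of allowed types, as in the code above):

```php
new StorageFile(20000000, [
    // Official MIME type for .xlsx (Office Open XML spreadsheet)
    'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
    // Official MIME type for legacy .xls
    'application/vnd.ms-excel',
])
```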

UserImport class:

class UserImport implements ToModel, WithHeadingRow, WithChunkReading, WithBatchInserts, ShouldQueue, WithEvents, WithValidation
{
    use Importable;

    private $upload_id;

    public function __construct(int $upload_id)
    {
        $this->upload_id = $upload_id;
    }

    // Required by the WithEvents concern
    public function registerEvents(): array
    {
        return [];
    }

    public function rules(): array
    {
        return [
            'username' => [
                'required',
                'string',
            ],
        ];
    }

    /**
     * @param array $row
     *
     * @return \Illuminate\Database\Eloquent\Model|null
     */
    public function model(array $row)
    {
        // Note: extend() registers a global formatter, so calling it here
        // runs once per imported row; registering it once at boot is cleaner
        HeadingRowFormatter::extend('custom', function ($value, $key) {
            return trim(strtolower($value));
        });

        $newdata = [
            'email' => $row['email'] ?? null,
            'username' => $row['username'] ?? null,
        ];
        
        return new User($newdata);
    }

    public function batchSize(): int
    {
        return 1000;
    }

    public function chunkSize(): int
    {
        return 5000;
    }
}
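On that note, the HeadingRowFormatter::extend() call inside model() can instead be registered once at boot time, for example in a service provider (AppServiceProvider here is just the conventional place, not required). Note the import also has to opt into the 'custom' formatter, e.g. via HeadingRowFormatter::default():

```php
use Illuminate\Support\ServiceProvider;
use Maatwebsite\Excel\Imports\HeadingRowFormatter;

class AppServiceProvider extends ServiceProvider
{
    public function boot()
    {
        // Register the heading formatter once, instead of per-row in model()
        HeadingRowFormatter::extend('custom', function ($value, $key) {
            return trim(strtolower($value));
        });

        // Make it the formatter Laravel Excel actually uses
        HeadingRowFormatter::default('custom');
    }
}
```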

I've tried a few different things and I'm not sure what's wrong. Is there a config change I need to make that I'm unaware of?

edstevo:

Hey - did you get anywhere with this? I'm having a similar issue queuing the job to SQS via vapor. The job fails because of "Entity Too Large" upload issues. I've read that I should be using "ShouldQueueWithoutChain" instead of just "ShouldQueue".

Looking for more insight...
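For context on that suggestion: SQS rejects any message over 256 KB, and a queued Laravel Excel import serializes the whole chain of chunk jobs (plus anything chained afterwards) into a single message. More rows means more chunk jobs and a larger payload, which would explain a clean failure above a row threshold with nothing in the logs or failed_jobs. Laravel Excel ships a ShouldQueueWithoutChain concern for this case; swapping it in is a small change. A sketch based on the UserImport class from the question:

```php
use Maatwebsite\Excel\Concerns\ShouldQueueWithoutChain;

// Same class as in the question, but each chunk job is dispatched as its
// own queue message instead of one large serialized chain
class UserImport implements ToModel, WithHeadingRow, WithChunkReading,
    WithBatchInserts, ShouldQueueWithoutChain, WithEvents, WithValidation
{
    // ... concerns and methods unchanged from the original class ...
}
```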
