I'm guessing that you have something like the following in your migration code:
// pseudo code
$oldData = \DB::table('old_table')->get();
// do some magical data processing here
foreach ($oldData as $row) {
    Model::create((array) $row); // get() returns objects; create() expects an array
}
First you pull all the old data into memory, then restructure it, then re-import it into the DB in the new shape.
But when you load all of that data into memory at once, at some point the script exceeds PHP's memory limit (or the machine's physical memory).
Without seeing the migration script, it's hard to say for sure.
If this is the case, you could change the migration strategy: fetch the old data in chunks, process each chunk, insert it into the new table, and then loop back for the next chunk, so only one chunk is in memory at a time.
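A minimal sketch of that chunked approach, assuming Laravel's query builder (the table name, chunk size of 500, and `Model` are placeholders from your code, not known values):

```php
// Process the old table 500 rows at a time; chunk() requires a stable ordering
\DB::table('old_table')->orderBy('id')->chunk(500, function ($rows) {
    foreach ($rows as $row) {
        // do your magical data processing on $row here, then insert
        Model::create((array) $row);
    }
});
```

Each callback invocation only holds one chunk of rows, so peak memory stays roughly constant no matter how big the old table is.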
It might also be that you keep data in memory while it's not needed anymore.
Something like:
$oldData = \DB::table('old_table')->get();
// do some magical data processing here
foreach ($oldData as $row) {
    Model::create((array) $row);
}

$moreOldData = \DB::table('other_old_table')->get();
// do some magical data processing here
foreach ($moreOldData as $row) {
    DifferentModel::create((array) $row);
}
If you do not unset() the $oldData variable before fetching the next dataset, both datasets sit in memory at the same time and your memory usage climbs rapidly.
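With the structure above, that would look like (same placeholder table names as before):

```php
$oldData = \DB::table('old_table')->get();
foreach ($oldData as $row) {
    Model::create((array) $row);
}
unset($oldData); // release the first dataset before loading the next one

$moreOldData = \DB::table('other_old_table')->get();
foreach ($moreOldData as $row) {
    DifferentModel::create((array) $row);
}
unset($moreOldData);
```

The unset() lets PHP's garbage collector reclaim each dataset as soon as you're done with it, instead of keeping every table's rows alive until the script ends.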
If you can't figure it out, anonymize the code if needed and post it here, and we can have a look at how to refactor it.