
jjudge's avatar

Can eloquent lock a whole table?

That's the question :-)

I realise rows can be locked when selecting, by using the lockForUpdate() method in an eloquent query. But what about a whole table?

We have a table that gets updated from a CSV file at regular times, and simply want to lock the table while doing the updates, just in case multiple update processes get run at the same time. The locks will force them to line up and take their turn. The updates are done in a single transaction, so an update process will update the whole table, or none of the table.
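To illustrate, a minimal sketch of the transaction-wrapped import described above (the table name, columns, and loop are invented for illustration):

```php
use Illuminate\Support\Facades\DB;

// Sketch of the "all or nothing" import flow; details are illustrative.
DB::transaction(function () use ($csvRows) {
    foreach ($csvRows as $row) {
        DB::table('rates')->updateOrInsert(
            ['code' => $row['code']],
            ['value' => $row['value'], 'effective_from' => $row['effective_from']]
        );
    }
}); // commits everything, or rolls it all back if an exception is thrown
```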

Now, we will be deploying to both MySQL and Azure SQL databases. The transaction handling and row locks of eloquent work brilliantly on both databases - the whole process is abstracted so it just works.

But whole table locks? Can't find any documentation on that. Anyone done this? Do I just need to manually check the database type then issue a raw DML table lock command appropriate to the RDBMS? Or does eloquent hold my hand and abstract it? Maybe there is an extension to eloquent that adds this feature?

0 likes
23 replies
Tray2's avatar

You could maybe use something like this.

DB::statement('LOCK TABLES mytable WRITE');

doYourStuff();

DB::statement('UNLOCK TABLES');

1 like
Tray2's avatar

@Cronix No idea, I've never used it myself in any database. As for Azure, I've never used it either, but there is probably something similar that can be done.

I'm not even convinced that it's a good option to lock the table at all. I would automate the import and not run it manually; that way there shouldn't be any risk of duplicates.

jlrdw's avatar

Just take this site down for a few minutes while this is going on. Or however long is needed.

Cronix's avatar

@tray2 I was just mentioning it because he said the servers will run on both MySQL and Azure. That's why he was looking for a solution built into the DB query builder (not raw), so it would work cross-platform.

jjudge's avatar

I'm not sure what taking the site down is going to solve!

Our application is going to be up 24/7, handling data through APIs, and we can't really afford to take it down on purpose. Handling the database updates is relatively trivial, and can easily be done while the application is running. It's a lookup table, so it will be 99% read, with an update at irregular times. The updates add rows with "effective from" dates in the future, so we won't be pulling the rug out from under any processes currently reading the data. The lock is just a bit of additional safety. I could just create a dedicated "lock table" with a single row in it for locking, and use that as the gateway into the updates.

I was mainly just wondering if locking whole tables is something that is handled by eloquent out of the box or not. If not, we may look at a PR to get it into Laravel.

jjudge's avatar

The transactions work, but solve a different problem. For a start, if two update processes ran at the same time (for whatever reason), and they started updating and inserting the same rows in the table, then they could easily get into a deadlock situation.

The locking of the whole table is to ensure no two jobs are updating the table at the same time, by making it a requirement that a job lock the table before updating it.

What the transaction will do is hold the lock for the duration of the updates, and then allow all the updates to be committed together, or rolled back together - there will be no half-complete updates; it's everything or nothing, and one job at a time.

jlrdw's avatar

Just use the PDO instance and lock the table, using getPdo().

Cronix's avatar

@jlrdw and how do you do that in multiple environments using different db servers? If they were all MySQL, that would work (since the statement would be the same), but they're not. That's basically what Tray2 suggested in his first post (although there's absolutely no need to get the actual PDO instance; just use DB::raw() like he was).

jlrdw's avatar

Well compile a c++ class to handle the Locking and unlocking and the database transaction.

I am sure there are many ways to handle this.

What is wrong with @tray2 answer.

jjudge's avatar

I can certainly find out which database type is being used by inspecting the PDO object.

It was worth a look, as PDO supports things like commit, rollback and starting transactions natively. It just doesn't have lock table.

I think I'm leaning towards a lockable row in a table set aside just for handling this. One advantage of doing this is that eloquent provides all the magic abstracting juice for it; it works across almost all database types (and works well).

It may not even be an issue in the long run if the update job is running on a queue, and the queue only has one worker.

Cronix's avatar

Well compile a c++ class to handle the Locking and unlocking and the database transaction.

Lol. "I have a laravel/db problem" ... "go code something in c++"

1 like
Cronix's avatar

What is wrong with @tray2 answer.

It's specific to mysql

From first post

Now, we will be deploying to both MySQL and Azure SQL databases.

click's avatar

This is probably not a solution for you but... I solved something similar once with a .lock file on the webserver.

Create the file when you start, and remove it when you are done. When you start your process, you check whether the file exists. If it does, you stop/halt/pause the process and try again later.

The only danger here exists if your process stops in the middle without cleaning up the .lock file. You could create fallbacks / notifications for this.
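One way around that danger is to let the OS manage the lock via PHP's flock(), which is released automatically when the process dies. A self-contained sketch (the function name and semantics are invented for illustration):

```php
<?php

// Hypothetical lock-file guard using flock(); unlike a manually created
// .lock file, the OS releases the lock if the process dies mid-import.
function withFileLock(string $path, callable $job): bool
{
    $fp = fopen($path, 'c'); // create if missing, don't truncate
    if ($fp === false) {
        return false;
    }
    // LOCK_NB makes this non-blocking: fail fast if another process holds it.
    if (!flock($fp, LOCK_EX | LOCK_NB)) {
        fclose($fp);
        return false; // another import is already running
    }
    try {
        $job();
    } finally {
        flock($fp, LOCK_UN);
        fclose($fp);
    }
    return true;
}
```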

jjudge's avatar

@click that is essentially what I am trying to do, and have done something similar with queued jobs that have small sections that cannot run at the same time (oauth token refreshes are one example, where two jobs could be hitting the same expired oauth token, and end up with one overwriting the new token grabbed by the other).

The advantage of locking a row in the database instead, is that if the process dies, then the lock is released automatically. My initial thought was that since I was updating the table, then it makes sense to lock that table. But thinking about it more, it really does not matter what I lock in the database, so long as all jobs updating that table agree to lock (or attempt to lock) the same thing.
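That dedicated lock-row pattern might look like this (table and key names are illustrative); lockForUpdate() makes a second job block until the first transaction commits or rolls back, and the row lock is released automatically either way:

```php
use Illuminate\Support\Facades\DB;

DB::transaction(function () {
    // Serialise on a single agreed-upon row; concurrent jobs wait here.
    DB::table('job_locks')
        ->where('name', 'csv_import')
        ->lockForUpdate()
        ->first();

    // ... perform the table updates; the row lock is held until commit.
});
```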

Tray2's avatar

@jlrdw That solution should work in both databases with a bit of different syntax of course.

Cronix's avatar

I'd look at just extending eloquent and adding a lock() method for both mysql/ms rather than try that lol. What a mess.

Cronix's avatar

Maybe just create a helper function for lock/unlock?

function lockTable($table) {
    // getDriverName() reports the underlying driver ('mysql', 'sqlsrv', ...),
    // which is safer than relying on the connection name.
    $driver = DB::connection()->getDriverName();

    if ($driver == 'mysql') {
        $sql = "LOCK TABLES {$table} WRITE";
    } elseif ($driver == 'sqlsrv') {
        $sql = "LOCK TABLE {$table}";
    } else {
        return; // unsupported driver; nothing to run
    }

    try {
        DB::unprepared($sql);
    } catch (\Exception $e) {
        // do something
    }
}

and similar for unlock?

then just

lockTable('table');
// do your stuff
unlockTable('table');
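One wrinkle worth checking: as far as I know, SQL Server has no LOCK TABLE statement; the usual equivalent is an application lock via sp_getapplock. Pulling the dialect choice into a pure function also makes it testable without a database (a sketch, with names invented for illustration):

```php
<?php

// Illustrative: build the driver-specific lock statement as a pure function.
// Assumption to verify: SQL Server lacks LOCK TABLE, so sp_getapplock is
// used as its equivalent here.
function lockTableSql(string $driver, string $table): string
{
    switch ($driver) {
        case 'mysql':
            return "LOCK TABLES {$table} WRITE";
        case 'sqlsrv':
            // Session-scoped app lock; released by sp_releaseapplock or disconnect.
            return "EXEC sp_getapplock @Resource = '{$table}', "
                 . "@LockMode = 'Exclusive', @LockOwner = 'Session'";
        default:
            throw new InvalidArgumentException("Unsupported driver: {$driver}");
    }
}
```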
jjudge's avatar

@Cronix I like that method. I wouldn't need the unlock if it's used in a transaction:

DB::transaction(function () {
    lockTable('table');

    // Do my stuff.
});

The lock would always be released when leaving the closure.

There is probably a way to extend the DB facade so DB::lockTable('table'); could be used.
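One possible route, assuming a Laravel version where facades use the Macroable trait (5.5+, I believe): register a macro in a service provider's boot() method (the macro name and body here are illustrative, not an established API). One caveat worth verifying first: MySQL's LOCK TABLES implicitly commits the current transaction and is only released by UNLOCK TABLES or session end, so the "released when leaving the closure" behaviour above may not hold on MySQL.

```php
use Illuminate\Support\Facades\DB;
use Illuminate\Support\ServiceProvider;

class LockingServiceProvider extends ServiceProvider
{
    public function boot(): void
    {
        // Hypothetical helper; dialect strings should be verified per RDBMS.
        DB::macro('lockTable', function (string $table) {
            $driver = DB::connection()->getDriverName();
            if ($driver === 'mysql') {
                DB::unprepared("LOCK TABLES {$table} WRITE");
            }
            // sqlsrv would need its own statement here.
        });
    }
}
```

after which DB::lockTable('table'); would be available anywhere.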

Overflow394's avatar

Just run your import jobs in a queue to process them sequentially. I agree that Laravel needs to have this feature for some use cases, but your problem has a better solution.

1 like
