Snapey wrote a reply · 1w ago
Hey everyone,
I got tired of debugging Laravel database queues blind — no visibility, just raw SQL queries and crossed fingers.
Horizon is great but requires Redis, which isn't always an option on shared hosting or smaller projects. So I built a lightweight alternative.
What it does
Lightweight Queue Inspector gives you a debugging dashboard for Laravel apps using the database queue driver:
- Pending jobs viewer with collapsible payload inspection
- Failed jobs with full exception messages, stack traces, retry and delete
- Successful jobs with execution time (colour coded) and memory usage
- Dashboard stats — pending count, failed count, avg execution time, top failing job
- Filters by queue name and job class
- Security warnings in the terminal if auth middleware is missing
- Laravel 10, 11 and 12 compatible
Install
composer require abinashbhatta/lightweight-queue-inspector
php artisan migrate
Then visit /queue-inspector. Done — no Redis, no extra config.
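On the security-warning point above: if the package follows Horizon's gate-based pattern for authorisation (an assumption; check its docs), access can be locked down from a service provider. The gate name here is hypothetical:

```php
// app/Providers/AppServiceProvider.php - a sketch, assuming the package
// exposes (or can be wrapped in) a gate-based check like Horizon does.
use Illuminate\Support\Facades\Gate;

public function boot(): void
{
    Gate::define('viewQueueInspector', function ($user) {
        return in_array($user->email, ['admin@example.com']);
    });
}
```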
Would love feedback from the community — what features would make this more useful for your projects?
Snapey was awarded Best Answer · 1mo ago
Fortify generates a unique secret for the user, and this is stored in the users table under two_factor_secret. This is then used to create the QR code that initialises the Time-based One-Time Password (TOTP) generator.
As long as the secret stays secret (on the server), it should not be possible to generate another valid TOTP.
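To make the TOTP mechanism concrete, here is a minimal sketch of RFC 6238 code generation. This is not Fortify's actual implementation (Fortify delegates to a TOTP library), and it assumes the shared secret has already been base32-decoded to raw bytes:

```php
<?php
// Minimal RFC 6238 TOTP sketch (illustrative, not Fortify's internals).
// $secret is the raw shared secret (already base32-decoded).
function totp(string $secret, int $timestamp, int $digits = 6, int $period = 30): string
{
    // Counter = number of elapsed periods, packed as 64-bit big-endian
    $counter = pack('J', intdiv($timestamp, $period));

    // HOTP core: HMAC-SHA1 followed by dynamic truncation (RFC 4226)
    $hash = hash_hmac('sha1', $counter, $secret, true);
    $offset = ord($hash[19]) & 0x0F;
    $code = ((ord($hash[$offset]) & 0x7F) << 24)
          | (ord($hash[$offset + 1]) << 16)
          | (ord($hash[$offset + 2]) << 8)
          | ord($hash[$offset + 3]);

    return str_pad((string) ($code % (10 ** $digits)), $digits, '0', STR_PAD_LEFT);
}

// RFC 6238 test vector (SHA1 secret "12345678901234567890", T = 59):
echo totp('12345678901234567890', 59); // 287082
```

Because the code depends only on the secret and the current time window, anyone without the secret cannot predict a future code, which is the point made above.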
Snapey wrote a reply · 1mo ago
This is one reason not to use migrations for one-time data transforms. Use an artisan command for this and run it manually.
Make sure such data transform commands are idempotent - that is, you can run them many times and the result will be the same.
Get used to looking at the migrations table. Sometimes the only way out of a hole is to alter that table directly (adding or removing a row).
But you can always revert to the backup... ? You have backups? Especially the one you took before transforming data?
You had the site in maintenance mode when you did this, so that the data was not being changed whilst you worked?
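A sketch of what an idempotent transform command could look like. The model, column, and command names here are all hypothetical; the pattern that matters is the whereNull() guard, which makes a re-run skip rows that were already transformed:

```php
<?php

namespace App\Console\Commands;

use App\Models\Post;
use Illuminate\Console\Command;
use Illuminate\Support\Str;

class BackfillSlugs extends Command
{
    protected $signature = 'app:backfill-slugs';

    protected $description = 'One-time backfill of post slugs (safe to re-run)';

    public function handle(): int
    {
        // The whereNull() guard is what makes this idempotent:
        // already-transformed rows are skipped on every subsequent run.
        Post::whereNull('slug')->chunkById(500, function ($posts) {
            foreach ($posts as $post) {
                $post->update(['slug' => Str::slug($post->title)]);
            }
        });

        return self::SUCCESS;
    }
}
```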
Snapey wrote a reply · 1mo ago
A password alone can be captured and replayed in attacks at a later date. Most 2FA solutions are time-sensitive, so even if someone is watching you type, they can't know what the code will be in the future.
People have their favourite passwords, so a compromise on one system can lead to break-ins on others.
Password-only logins are also vulnerable to password reset tactics. If a bad guy can intercept or eavesdrop on your email, they can attempt a password reset. Not so with 2FA.
Snapey started a new conversation · 2mos ago
I used Claude to build a project from scratch.
When it created the migrations, it created them in batches; the problem was that each batch got the same timestamp. As these tables were related and had foreign key constraints, after I cloned the project and tried to migrate the database, it fell over a couple of times because of table dependencies in the constraints.
Has anyone encountered this and found a way to tell the agent to wait a second between file creations?
Snapey wrote a reply · 2mos ago
I have a customer, a smart guy, but a physician, not a developer. I have had a project for him since Laravel 7. It uses Livewire and MySQL. We have worked on this site for over 6 years.
Recently he sent me a bunch of files and asked if I could add a huge new feature that he had 'built' using Claude. Problem is, it's Next.js, React, Postgres and a whole bunch of libraries for authentication, subscriptions, analytics, etc. Of course he did this in good faith, with no idea what he was asking. The files even contained information about how many hours I would need to deploy it!
Obviously the answer is no, but this has damaged our relationship, and I'm not sure how to handle it. I could ask Claude to convert it, but then I would be missing all the project requirements. I don't know how it should behave to produce a quality output.
Snapey was awarded Best Answer · 2mos ago
Snapey wrote a reply · 2mos ago
I have a scenario with a 'warm' standby server located on a VPS in a different datacentre. It does not need to be synced in real time, as all data is transient, but it does need to be synced daily.
A console command on the live system uses Spatie's DbDumper to create a dump of certain tables into a file on an S3 bucket. Both systems have access to the same bucket. At some time after the dump was created, the standby system pulls the file from S3 and applies it to its copy.
I've pasted the two commands below. They are pretty simple to understand.
DumpDB.php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Storage;
use League\Flysystem\MountManager;
use Spatie\DbDumper\Databases\MySql;

class DumpDb extends Command
{
    protected $signature = 'app:dumpdb';

    protected $description = 'Creates a copy of the database so that it can be picked up by the standby';

    public function handle()
    {
        $filename = 'latest-db.sql';

        // Dump only the tables the standby needs, using the credentials
        // from the default mysql connection
        MySql::create()
            ->setDbName(config('database.connections.mysql.database'))
            ->setUserName(config('database.connections.mysql.username'))
            ->setPassword(config('database.connections.mysql.password'))
            ->setDumpBinaryPath(config('database.connections.mysql.dump.dump_binary_path'))
            ->includeTables(['system_user', 'systems', 'users', 'lookups', 'features', 'consignments', 'postcodes', 'timeanddistance', 'reports', 'sites'])
            ->dumpToFile(storage_path('app/db/' . $filename));

        // Replace the previous dump on S3 with the fresh one
        $mountManager = new MountManager([
            's3' => Storage::disk('s3')->getDriver(),
            'local' => Storage::disk('local')->getDriver(),
        ]);

        Storage::disk('s3')->delete($filename);
        $mountManager->copy('local://db/' . $filename, 's3://' . $filename);

        return 0;
    }
}
PullDb.php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\File;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Storage;
use League\Flysystem\MountManager;

class PullDb extends Command
{
    protected $signature = 'app:pulldb';

    protected $description = 'Fetches database copy from S3 and inserts into local database';

    public function handle()
    {
        $this->info('Getting the backup.');

        $filename = 'latest-db.sql';

        // Get a copy of the database dump from S3, replacing any stale local copy
        $mountManager = new MountManager([
            's3' => Storage::disk('s3')->getDriver(),
            'local' => Storage::disk('local')->getDriver(),
        ]);

        if (File::exists(storage_path('app/db/' . $filename))) {
            File::delete(storage_path('app/db/' . $filename));
        }

        $mountManager->copy('s3://' . $filename, 'local://db/' . $filename);

        $this->info('Importing...');

        // Run the SQL script to import the database tables.
        // Note: DDL statements in a mysqldump file (DROP/CREATE TABLE) cause
        // implicit commits in MySQL, so the transaction is best-effort only.
        try {
            DB::beginTransaction();
            DB::unprepared(File::get(storage_path('app/db/' . $filename)));
            DB::commit();
        } catch (\Throwable $th) {
            DB::rollBack();
            Log::error('Importing the production database from the copied file failed: ' . $th->getMessage());

            return 1;
        }

        Log::info('Successfully imported partial database from the live system');

        return 0;
    }
}
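For completeness, the two commands can be staggered with Laravel's scheduler so the standby always pulls after the live dump has finished. The times below are just examples, and each command would be scheduled in its own deployment's console kernel:

```php
// app/Console/Kernel.php - a sketch of staggering the pair (times are examples)
protected function schedule(Schedule $schedule): void
{
    // On the live system:
    $schedule->command('app:dumpdb')->dailyAt('02:00');

    // On the standby system (separate deployment, later time):
    $schedule->command('app:pulldb')->dailyAt('03:00');
}
```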
Snapey wrote a reply · 2mos ago
If you were inserting 500 rows one by one into a table of 2 million rows with unique constraints, then it would slow things considerably.
It would also be slow if, for each insert, you were also querying other tables to resolve foreign key values.
But as you are providing next to zero information, it's hard to guess.
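A sketch of the difference in Laravel's query builder; the table and data here are made up for illustration:

```php
// Slow: one round-trip, one constraint check, and one index update per row
foreach ($rows as $row) {
    DB::table('orders')->insert($row);
}

// Faster: batch the rows into multi-row INSERT statements
foreach (array_chunk($rows, 500) as $chunk) {
    DB::table('orders')->insert($chunk);
}
```

Pre-resolving any foreign key lookups into an in-memory map before the loop, rather than querying per row, usually helps just as much.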
Snapey wrote a reply · 2mos ago
I got fed up with deployment issues with Browsershot-type solutions, so I switched to an API from https://www.neutrinoapi.com/
Their HTML render service does it quickly and easily and has been 100% reliable.
Snapey was awarded Best Answer · 2mos ago
Imagine a mid-sized business that has a number of roles. Sometimes there are job overlaps. You are a manager, but sometimes you need to perform certain functions that the accounts team would normally do, such as approving a new supplier. Unless you can give this manager specific individual permissions, you end up needing to give them the whole accounts role as well as their manager role, or you have to create a new role that is all the manager permissions plus the one accounting permission needed.
It is also useful for times of sickness or holiday cover. You can temporarily assign someone additional permissions without also giving them to other managers.
There is nothing you can do with direct permissions that you cannot do by creating a new role and then assigning that role to one person. It's just clumsier and harder to train for.
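Assuming a package like spatie/laravel-permission is in play (an assumption; the permission names are illustrative), the manager scenario above looks like this:

```php
// The manager keeps their role and gets exactly one extra direct permission,
// instead of a whole second role or a bespoke combined role.
$manager->assignRole('manager');
$manager->givePermissionTo('approve-supplier');

// Holiday cover: grant temporarily, then revoke when the colleague returns
$cover->givePermissionTo('approve-supplier');
// ...later
$cover->revokePermissionTo('approve-supplier');
```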
Snapey wrote a reply · 2mos ago
ChatGPT offers these:
Golden rules for using LAST_INSERT_ID():
✅ Read it immediately
✅ On the same connection
✅ Capture it into a variable
❌ Don't assume ordering, contiguity, or meaning
❌ Don't rely on it across triggers or pooled connections
The main point being that the framework inserts and gets the id on adjacent lines, on the same connection. Your typical application code might call Eloquent to insert a row and then, potentially thousands of executed lines later, try to get the last insert id.
What is dangerous is that with one user testing, this might work perfectly well, but as the app's usage grows or gets more complex, you start to get issues. Best avoided, and probably not what the OP needed anyway.
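In Eloquent terms (a sketch; the model is illustrative), the safe and risky patterns look like this:

```php
// Safe: Eloquent reads LAST_INSERT_ID() on the same connection,
// immediately after the INSERT, and hands it back on the model.
$user = User::create(['name' => 'Ada']);
$id = $user->id; // captured at once into a variable

// Risky: a separate query issued much later (or on another/pooled
// connection) may pick up someone else's insert - avoid this.
$id = DB::select('SELECT LAST_INSERT_ID() AS id')[0]->id;
```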
Snapey was awarded Best Answer · 2mos ago
I think about it this way. If you just want to check ABILITY, then use a Gate - i.e. can this user edit Users, yes or no? Use policies if you need control at the model level - i.e. can this user edit Users from team X? Your policy can consume permissions directly or via gates.
Gate = general ability.
Policy = ability to do something with a specific thing.
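A sketch of the distinction; the model, ability name, and isAdmin() helper are all illustrative:

```php
// Gate = general ability (no model instance needed)
Gate::define('edit-users', fn ($user) => $user->isAdmin());

// Policy = ability with respect to a specific model instance
class UserPolicy
{
    public function update(User $actor, User $subject): bool
    {
        // reuse the general ability, then add the model-level restriction
        return $actor->can('edit-users')
            && $actor->team_id === $subject->team_id;
    }
}
```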
Snapey wrote a reply+100 XP
2mos ago
I think about it this way. If you just want to check ABILITY then use a Gate. ie, can this user edit Users, yes or no? Use policies if you need control at the model level, ie can this user edit Users from team X. Your policy can consume permissions directly or via gates.
Gate = general ability.
Policy = ability to do something with specific thing.