simotion wrote a comment
1w ago
@Niush No it's not. Tailwind uses 'cursor: default' for buttons by default. It even says so in the documentation you linked to 😉
simotion wrote a comment
1mo ago
Nice series!
You forgot cursor-pointer on the buttons ;)
simotion wrote a reply
3mos ago
You're running the php artisan command in the NGINX container instead of the PHP container; that's why you're getting the error message.
simotion wrote a reply
3mos ago
Bit rusty on Docker, but I'll do my best to help.
Have you verified that the public/storage symlink exists inside the Nginx container, not just the PHP container?
Since you're using separate Nginx and PHP-FPM containers, php artisan storage:link runs in the PHP container — but Nginx is the one serving static files.
In your Nginx Dockerfile you only copy:
COPY --from=builder /var/www/public /var/www/public
That means:
- The storage directory is not copied
- The symlink created in the PHP container is not present
- There is no shared volume between the containers
So when the browser requests /storage/..., Nginx simply doesn’t have that path — hence the 404.
You have a few options:
- Share the entire project directory as a Docker volume between Nginx and PHP (see the sketch below)
- Build both containers from the same source layer so storage and the symlink exist in both
- If these are static assets that ship with your app (like quote.svg), consider placing them directly in /public/img instead of storage
Right now, Nginx has no access to the storage directory, which is almost certainly the issue.
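For illustration, the first option could look roughly like this in docker-compose (a sketch only; service and volume names are made up, not from your setup):

# Sketch: one named volume mounted into both containers, so Nginx can
# see the storage directory and the symlink created by
# `php artisan storage:link`. Paths and names are illustrative.
services:
  php:
    build: ./docker/php
    volumes:
      - app-code:/var/www
  nginx:
    build: ./docker/nginx
    volumes:
      - app-code:/var/www:ro   # read-only is enough for serving files
    depends_on:
      - php
volumes:
  app-code: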
simotion wrote a reply
3mos ago
I'm really curious about that function now!
simotion liked a comment
3mos ago
I think an important reason could be the idea that a Laravel application is theoretically supposed to be database-independent.
simotion wrote a reply
3mos ago
Not exactly a black-and-white answer to that. A week could be time well spent fixing technical debt before the interest adds up.
Does the pending work require you to interact with that code? For example, does the code you have now contain service classes that you need to use in your pending work?
If so, take time to refactor, because delaying it will increase your technical debt.
If it's very loosely coupled, works, and doesn't have an impact on your pending work, I would focus on finishing the MVP and refactor later on.
No code is ever perfect, and perfection is the enemy of good. Striving for perfection will delay your product and many projects have never been finished that way. Trust me, we've all been there.
Get your MVP ready. Once there's cashflow start refactoring.
My strategy while coding is:
- Abstract where needed
- Form validation always in FormRequest classes
- Business logic in service classes immediately
- Keep controllers (c)lean
- Build controllers for every endpoint instead of using closures inside your routes file
Even if the service classes or form request classes get messy, it's contained and easy to optimize later on. Even if a model becomes a god class, it's contained to that model.
Just make sure you don't cram everything from validation to business logic inside the controller(s), because that gets messy real quick.
Oh, and build tests from the get-go. Your future self will thank you for it.
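To make that concrete, here's a minimal sketch of the lean-controller pattern from the list above (class and method names like StoreOrderRequest and OrderService::create are hypothetical, not from this post):

class StoreOrderRequest extends \Illuminate\Foundation\Http\FormRequest
{
    // Validation lives here, not in the controller.
    public function rules(): array
    {
        return [
            'product_id' => ['required', 'exists:products,id'],
            'quantity'   => ['required', 'integer', 'min:1'],
        ];
    }
}

class OrderController extends Controller
{
    // The controller only wires the FormRequest and the service together.
    public function store(StoreOrderRequest $request, OrderService $orders)
    {
        $order = $orders->create($request->validated());

        return redirect()->route('orders.show', $order);
    }
}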
simotion started a new conversation
3mos ago
There’s a lot of information out there about building a SaaS, but guidance on managing multiple SaaS products in a centralized and scalable way seems scarce—or maybe I’m just not using the right search terms.
In preparation for the public launch of my first SaaS application, I'm building a central management platform which I'm naming 'MCP' (a slight nod to one of my all-time favorite movies, TRON) to manage deployments, accounts, subscriptions, and billing in a centralized and scalable way. Right now, I'm focusing on the architecture of the communication between the MCP and the deployments of my application(s).
Here’s a high-level overview of how I envisioned it to work:
- Application servers only know which features are available and which limits apply per account.
- Subscription plans, pricing, addons, and billing are managed centrally in the MCP.
- When an account upgrades or purchases a subscription/addon, the MCP sends the new set of features and limits to the relevant deployment via signed API requests (HMAC/JWT).
- The deployment server temporarily caches the payload and sends a SYN-ACK-like message back to confirm receipt. Only after receiving an ACK from the MCP are the new features applied to the account. The MCP only sends an ACK if it initiated the request.
- FQDN-based communication:
  - Deployments know the MCP's FQDN and direct all requests/responses there.
  - The MCP knows the main FQDN of each deployment; responses are always sent to this known FQDN.
  - Requests arriving via other FQDNs or IPs (e.g., through a CNAME) are accepted, but the sender in the request header is never used for the response.
  - Authenticity is ensured via the SYN-ACK/ACK handshake between verified URLs.
- Server-to-server only: all communication between the MCP and deployments happens directly between the backends. No client-side code ever communicates with the MCP.
The idea is that, even if a key or secret is intercepted, an unauthorized party cannot simply grant an account full access to all features.
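For what it's worth, the signing part could look roughly like this (a sketch assuming Laravel's Http client; names like $sharedSecret, $deploymentFqdn, and the /api/mcp/feature-sync route are illustrative, not from my actual setup):

use Illuminate\Support\Facades\Http;

// Sender (MCP) side: sign the exact payload bytes with a shared secret.
$payload = json_encode([
    'account_id' => $accountId,
    'features'   => $features,
    'nonce'      => bin2hex(random_bytes(16)), // guards against replays
    'issued_at'  => now()->timestamp,
]);
$signature = hash_hmac('sha256', $payload, $sharedSecret);

$response = Http::withHeaders(['X-Signature' => $signature])
    ->withBody($payload, 'application/json')
    ->post($deploymentFqdn . '/api/mcp/feature-sync');

// Receiver (deployment) side: recompute and compare in constant time
// before caching the payload and sending the SYN-ACK back to the MCP's
// known FQDN (never to the request's Host header).
$expected = hash_hmac('sha256', $request->getContent(), $sharedSecret);
abort_unless(hash_equals($expected, $request->header('X-Signature')), 401);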
💡 My questions for you:
- How do you manage multiple SaaS products and deployments? Have you centralized it, and if so, what strategy did you use?
- What strategies do you use to secure server-to-server communication between deployments?
- Are there best practices for additional verification, key management, or secure feature provisioning that I might be missing in my approach?
simotion liked a comment
4mos ago
simotion wrote a reply
4mos ago
I think I may have given the wrong impression by referring to the new model as “history” in my earlier posts.
It’s called ProductPrice rather than PricingLog or PricingHistory for a reason. It isn’t merely a log; it’s domain data. Instead of updating a single row in place, price changes are stored as new entries. This keeps historic prices in the database primarily for auditing purposes, but decoupling prices from the products table also enables future use cases the original architecture didn’t support.
For example, the model could support temporary prices (e.g., Black Friday deals) via a nullable valid_until column.
The part I’m struggling with is whether the current price should be derived from this model (making it the conceptual source of truth), or whether it’s acceptable to keep a denormalized current value on the product purely for performance reasons.
Keeping denormalized columns stores current prices in two places — on Product and on ProductPrice. It adds complexity to the codebase and isn’t as explicit as keeping prices solely in the prices table, in exchange for faster reads in product-heavy parts of the application.
At what point would you say the added complexity isn’t worth the performance gain? For instance, with temporary prices, a denormalized approach would require a scheduled task to detect expired prices and update the product table back to the last valid price, whereas deriving the current price from the ProductPrice model wouldn’t. I’ve also considered caching, but that feels like pushing domain logic into cache management.
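For illustration, deriving the current price from the model (including the hypothetical valid_until column mentioned above, and assuming a prices() relation) could be as simple as:

public function currentPrice(string $type)
{
    // Latest non-expired entry wins; once a temporary price expires,
    // the query naturally falls back to the last permanent price,
    // with no scheduled task needed.
    return $this->prices()
        ->where('price_type', $type)
        ->where(function ($query) {
            $query->whereNull('valid_until')
                  ->orWhere('valid_until', '>', now());
        })
        ->latest('id')
        ->value('price_value');
}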
Curious how others would draw that line.
simotion liked a comment
4mos ago
simotion wrote a reply
4mos ago
Yes, I would advise making a habit out of it. It makes your code robust and self-documenting, and it makes your life easier during debugging and coding (autocomplete).
No need for docblocks if you have good attribute names, typed parameters, and typehinted return types. It helps keep your code easy to read.
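For example (a made-up signature; Price is a hypothetical value object), the types alone document what goes in and what comes out:

public function applyDiscount(Price $price, float $percentage): Price
{
    // The signature already tells the reader and the IDE everything
    // a docblock would; no @param/@return needed.
    return $price->multiply(1 - $percentage / 100);
}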
simotion liked a comment
4mos ago
The more specific and explicit you are, the easier your life is. So, use types everywhere you can.
simotion wrote a reply
4mos ago
To add some more context, this is how I currently persist pricing history from the Product model.
On creation I store the initial prices:
protected static function booted()
{
    static::created(function (Product $product) {
        // Record the initial value for each of the three price types.
        ProductPrijs::add($product, PrijsType::INKOOP, $product->prijs_inkoop);
        ProductPrijs::add($product, PrijsType::VERKOOP, $product->prijs_verkoop);
        ProductPrijs::add($product, PrijsType::ADVIES, $product->prijs_advies);
    });
}
And when a price is updated, I append a new ProductPrijs record only if the value actually changed:
public function setPrijsVerkoopAttribute($value)
{
    $oldPrice = $this->getOriginal('prijs_verkoop');
    $newPrice = Price::fromFloat($value);

    $this->attributes['prijs_verkoop'] = $newPrice->getValue();

    // Append a history record only for existing products whose price
    // actually changed, keeping the history append-only.
    if ($this->exists && $oldPrice && ! $oldPrice->equals($newPrice)) {
        ProductPrijs::add($this, PrijsType::VERKOOP, $newPrice);
    }
}
…and similar mutators for the other two price types.
This keeps the history append-only and avoids unnecessary writes.
I’m aware this introduces side effects (using a mutator), but this was a conscious trade-off for now. The main thing I’m trying to balance is keeping product reads fast (no joins or global scopes on heavy product listings) while still having a complete and reliable pricing history.
Curious how others would structure this — especially around where this responsibility should live (mutator vs observer vs service), and whether you’d keep the denormalized price columns in this scenario.
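For comparison, the observer variant of the same rule might look like this (a sketch, not from the code above; it just mirrors the existing ProductPrijs::add calls):

class ProductObserver
{
    // Same append-only rule as the mutator, expressed as a model event,
    // so the attribute setter stays free of side effects.
    public function updating(Product $product): void
    {
        if ($product->isDirty('prijs_verkoop')) {
            ProductPrijs::add($product, PrijsType::VERKOOP, $product->prijs_verkoop);
        }
    }
}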
simotion started a new conversation
4mos ago
I originally had three price columns on my products table: price_purchase, price_sell and price_advice.
To support pricing history, I introduced a ProductPrice model. The product_prices table contains:
- id
- product_id
- price_type (purchase / sell / advice)
- price_value
- publisher_id
- publisher_name (cached for when a user gets deleted)
- timestamps
Whenever a product is created or updated, I check whether a price changed and, if so, I insert a new ProductPrice record. This part works well and gives me full pricing history per product.
The dilemma I’m facing now is where the single source of truth should live.
If I fully rely on the ProductPrice model for “current” prices, I need to load relational data for every product. Eager loading all prices caused a significant memory increase on product index and inventory management pages. Even when limiting this to only the most recent price per type (using scopes or subqueries), it quickly adds up to three extra lookups per product.
With a growing user base (±50 concurrent users) and fairly heavy product listings and inventory views, I’m worried about the long-term performance impact of always deriving the current price from the new table.
The alternative is to keep the original price columns on the products table and treat them as denormalized “current price” fields, while using product_prices purely for historical insight (charts, audits, etc.). This would avoid joins and additional queries entirely for most product-related logic.
However, this feels conceptually “wrong” to me, since the ProductPrice model then isn’t the single source of truth anymore.
I’m curious how others would approach this:
- Would you denormalize and accept the duplication for performance?
- Use subqueries or views to project current prices?
- Or structure this differently altogether?
How would you optimize this without sacrificing performance?
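For context, the subquery approach I mentioned could look like this (a sketch assuming the product_prices layout above; the scope name is made up):

public function scopeWithCurrentSellPrice($query)
{
    // Projects the latest sell price as a column, avoiding an
    // eager-loaded relation on heavy index pages.
    return $query->addSelect([
        'current_sell_price' => ProductPrice::select('price_value')
            ->whereColumn('product_id', 'products.id')
            ->where('price_type', 'sell')
            ->latest('id')
            ->limit(1),
    ]);
}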
simotion liked a comment
5mos ago
Here's the source code for this course!
