Developer654079525 liked a comment
3w ago
Definitely don't duplicate the code in both controllers; keep it DRY.
Since it involves fetching data, the logic belongs in the Model. Using a Local Scope is the most idiomatic Laravel way to handle this. However, if it’s a more specialized fetch that doesn’t fit a scope, a static method on the model is a solid choice. Just keep the database logic out of the controllers.
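For illustration, a minimal sketch of the local-scope approach (the model name, scope name, and columns here are hypothetical):

use Illuminate\Database\Eloquent\Model;

class SomeModel extends Model
{
    // Encapsulates the shared fetch so both controllers reuse one definition.
    public function scopeForSite($query, int $siteId)
    {
        return $query->where('site_id', $siteId)->where('active', true);
    }
}

// In either controller:
$result = SomeModel::forSite($siteId)->get();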
Developer654079525 liked a comment
3w ago
Moving it to a model seems more appropriate, since this helper interacts with the database. Use a local scope: https://laravel.com/docs/13.x/eloquent#local-scopes
Developer654079525 started a new conversation
3w ago
Where should I keep a helper function that fetches some data from a particular DB table and is then used extensively across only two controller classes? Should I keep it as a private member function in both controllers and then invoke it in other member functions as:
$result = $this->myFn();
Or should I move it to a model, mark it as static, and then invoke it in my controllers as:
$result = SomeModel::myFn();
Developer654079525 liked a comment
1mo ago
In this case you can write a backed enum.
namespace App\Enumerations;

enum Opcode: string
{
    case Good = 'good';
    case Bad = 'bad';
}
https://www.php.net/manual/en/language.enumerations.backed.php
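For example, converting between cases and backing values works like this (a minimal sketch):

$opcode = Opcode::Good;
$opcode->value;          // 'good'
Opcode::from('bad');     // Opcode::Bad
Opcode::tryFrom('oops'); // null instead of throwing a ValueError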
Developer654079525 liked a comment
1mo ago
@developer654079525 It really depends on how that raw HTML is being authored in the first place.
Ideally, yes, you want to remove any hard-coded references to dynamic content. If a user is linking to a page controlled by the CMS, it's probably better to store some sort of identifier rather than the URI itself. You will then need to decide what to do with links to pages that are later deleted entirely, though.
Developer654079525 liked a comment
1mo ago
You can store all the links in the database and, inside the texts, reference each link with a placeholder like {:link:} that gets replaced with the actual URL.
That means if you need to update some links, you don't need to update the texts, only the links. Either way you will need to update something.
But personally I think it's simpler to just update the texts.
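A minimal sketch of that placeholder idea (the Link model and the {:slug:} syntax are assumptions, not an established convention):

// Map of placeholder slugs to current URLs, e.g. ['pricing' => 'https://example.com/pricing'].
$links = Link::pluck('url', 'slug')->all();

// Swap every {:slug:} for its current URL; unknown placeholders are left untouched.
$html = preg_replace_callback('/\{:([\w-]+):\}/', function (array $m) use ($links) {
    return $links[$m[1]] ?? $m[0];
}, $rawHtml);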
Developer654079525 started a new conversation
1mo ago
If a CMS stores raw HTML, and some anchor links, or parts of them (not many, but a few), might change in the future, what is the preferred approach to storing such links? Use some made-up syntax/templating like {:something1}, {:something2} and then regex-replace these in the future?
Currently I keep the hard-coded full URLs and replace the parts that change later.
Developer654079525 started a new conversation
1mo ago
Currently, we have several VMs in which we are developing apps, and we haven't needed a staging segment yet. If a small shop is developing an app, do they really need a staging segment on their own or the client's website? If so, would it become staging.example.com or example.com/staging? I am not sure about the mechanics, so I would kindly ask for some guidelines.
For example, how (and whether) to hide this from search engines, and similar concerns.
Developer654079525 liked a comment
1mo ago
Is it the same web hosting? Shared hosting setups sometimes differ.
I also have shared hosting, but I don't need to put the public directory of the Laravel project inside a specific folder. So there is no need to change anything in the structure of the project; I just point the domain name at the public folder of the Laravel project and it's done.
So I think it depends on what you have to do to get it working on your hosting.
Can you give more information, please?
Developer654079525 liked a comment
2mos ago
I have a scenario with a 'warm' standby server located on a VPS in a different datacentre. It does not need to be synced in real time, as all data is transient, but it does need to be synced daily.
A console command on the live system uses Spatie DbDumper to create a dump of certain tables in a file on an S3 bucket. Both systems have access to the same bucket. Some time after the dump is created, the standby system pulls the file from S3 and applies it to its own copy.
I've pasted the two commands below. They are pretty simple to understand.
DumpDb.php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Storage;
use League\Flysystem\MountManager;
use Spatie\DbDumper\Databases\MySql;

class DumpDb extends Command
{
    protected $signature = 'app:dumpdb';

    protected $description = 'Creates a copy of the database so that it can be picked up by the standby';

    public function handle()
    {
        $filename = 'latest-db.sql';

        // Dump only the tables the standby needs into a local file.
        MySql::create()
            ->setDbName(config('database.connections.mysql.database'))
            ->setUserName(config('database.connections.mysql.username'))
            ->setPassword(config('database.connections.mysql.password'))
            ->setDumpBinaryPath(config('database.connections.mysql.dump.dump_binary_path'))
            ->includeTables(['system_user', 'systems', 'users', 'lookups', 'features', 'consignments', 'postcodes', 'timeanddistance', 'reports', 'sites'])
            ->dumpToFile(storage_path('app/db/' . $filename));

        // Replace the previous dump on the shared S3 bucket with the fresh one.
        $mountManager = new MountManager([
            's3' => Storage::disk('s3')->getDriver(),
            'local' => Storage::disk('local')->getDriver(),
        ]);

        Storage::disk('s3')->delete($filename);
        $mountManager->copy('local://db/' . $filename, 's3://' . $filename);

        return 0;
    }
}
PullDb.php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\File;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Storage;
use League\Flysystem\MountManager;

class PullDb extends Command
{
    protected $signature = 'app:pulldb';

    protected $description = 'Fetches database copy from S3 and inserts into local database';

    public function handle()
    {
        $this->info('Getting the backup.');
        $filename = 'latest-db.sql';

        // Get a copy of the dump from the shared S3 bucket.
        $mountManager = new MountManager([
            's3' => Storage::disk('s3')->getDriver(),
            'local' => Storage::disk('local')->getDriver(),
        ]);

        if (File::exists(storage_path('app/db/' . $filename))) {
            File::delete(storage_path('app/db/' . $filename));
        }

        $mountManager->copy('s3://' . $filename, 'local://db/' . $filename);

        // Run the SQL script to import the database tables.
        $this->info('Importing...');

        try {
            DB::beginTransaction();
            DB::unprepared(File::get(storage_path('app/db/' . $filename)));
            DB::commit();
        } catch (\Throwable $th) {
            DB::rollBack();
            Log::error('Importing the production database from the copied file failed');

            return 1; // bail out so a failed import is not logged as a success
        }

        Log::info('Successfully imported partial database from the live system');

        return 0;
    }
}
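One way to coordinate the two commands (the times below are assumptions; the point is simply that the pull runs well after the dump has finished):

// app/Console/Kernel.php on the live system:
protected function schedule(Schedule $schedule)
{
    $schedule->command('app:dumpdb')->dailyAt('01:00');
}

// app/Console/Kernel.php on the standby system:
protected function schedule(Schedule $schedule)
{
    $schedule->command('app:pulldb')->dailyAt('02:00');
}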
Developer654079525 liked a comment
2mos ago
Development and production databases should be completely separate. There should be no data transfer between them.
Schema changes are done through migrations, like @snapey said. You run migrations whenever you deploy changes to production.
How would "syncing" even work when there's production data? Let's say someone registers on your site. If you import a MySQL dump from development, that user account will vanish.
Developer654079525 wrote a reply
2mos ago
I see, I get a clearer picture now. It seems the current workflow can and should be improved. Currently we do not have user registration yet; we create SQL dumps on the dev machine and the node machine, and manually import them on the live server.
But if we were to sync in one direction only, what approach would you recommend? The mysqldump route?
Developer654079525 started a new conversation
2mos ago
What is the preferred way of syncing a local MySQL database with the production one that sits on shared hosting? We are using Git, SSH and Composer to sync the framework, but the MySQL sync is a bit of a pain. Currently we use phpMyAdmin to manually export on the dev machine, then drop and import on the live hosting account. How can this be automated so that we don't have that brief downtime while syncing? Both the dev and hosting machines are running Linux.
The shared hosting allows remote MySQL connections and has all the bash goodies as well. Is mysqldump in combination with scp and ssh the way to go?
Developer654079525 liked a comment
2mos ago
You could create a middleware and apply it to the route. Conditional clauses don't work when routes are cached, so it's best to keep the route files simple.
use Closure;
use Illuminate\Http\Request;

class LocalOnly
{
    public function handle(Request $request, Closure $next)
    {
        // Pretend the route doesn't exist outside the local environment.
        if (! app()->environment('local')) {
            abort(404);
        }

        return $next($request);
    }
}
...
Route::get('myroute')->middleware(LocalOnly::class);
Developer654079525 started a new conversation
2mos ago
The latest framework update made some security and workflow changes to the PsySH library used by Tinker. Now, the first time we start Tinker, we are greeted with a confirmation dialog. This looks like a major shift from a usage perspective. I understand they are tackling a security issue, but I'm not sure the library is the one that should be displaying the dialog. What are your thoughts on this?
Developer654079525 wrote a reply
2mos ago
We are displaying those elements on a page and expanding them in the print version via CSS; this works out of the box with Bootstrap. Unfortunately, it looks like we are stuck with raw HTML for now. In raw HTML's defense, the markdown approach isn't without its own limitations either: think quick edits and nasty newline conundrums when using third-party software to quickly edit a record, plus custom class styling limitations and ugly parser output. At this point, it looks like we must continue with the raw HTML and endure the WYSIWYG editing pains.
Developer654079525 started a new conversation
2mos ago
Our CMS stores raw HTML in a table. The HTML was written by us and is displayed unescaped as part of a page in multiple routes. The content usually contains simple tags, escaped pre/code content, and some Bootstrap-styled divs and tags. Some of the divs even use Bootstrap's JavaScript functionality.
A markdown approach seems an easier option for editing, although less customizable. What should we do now: continue with the raw HTML, or make the effort to somehow convert it to markdown and continue with markdown?
The content is around 80-100 records so far, and the records that use Bootstrap's JS functionality number around 20 in total.
We are also using third-party libs on a dedicated machine to export the content to PDF, and that works well.
The speed of writing and editing articles stored this way is a bit of a pain point for us. There is a decision to be made whether to somehow convert/rewrite the existing HTML to markdown or continue with the HTML. What tools/approach would you suggest?
Developer654079525 liked a comment
4mos ago
It provides type safety, which helps you catch bugs. It's also a hint for your IDE, allowing auto-completion. So yes, definitely.
I always use types when possible, because there's no good reason not to. If you want to allow any type, you can use mixed. You still can't type-hint resource (e.g., a file handle), because it's not an actual type in the Zend Engine.
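A quick illustration (the function itself is hypothetical):

// mixed accepts anything; the return type still documents intent.
function normalize(mixed $value): ?string
{
    return is_scalar($value) ? (string) $value : null;
}

// There is no 'resource' type declaration; PHP would treat it as a class name,
// so a signature like read(resource $handle) could never accept a real file handle.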
Developer654079525 liked a comment
4mos ago
Yes, I would advise making a habit of it. It makes your code robust and self-documenting, and it makes your life easier during debugging and coding (autocomplete).
There's no need for docblocks if you have good attribute names, type hints, and type-hinted return types. It helps keep your code easy to read.
Developer654079525 liked a comment
4mos ago
I outsource PDF creation to third parties. Via an API, you send them the HTML referencing your external stylesheets; they render the PDF and return it.
I have had good results with Neutrino API https://www.neutrinoapi.com/api/html-render/ but other similar services are available.
Developer654079525 liked a comment
4mos ago
HTML-with-CSS to PDF never works well. Good luck if you rely on JS-driven views.
The best for me has been, as mentioned above, barryvdh/laravel-dompdf. I use Blade views under ~/resources/views/print/pdf/.... Minimal CSS, overridden from the HTML pages; pretend you're in the '90s writing HTML email. You cannot do fancy things, as the DOMPDF library does not support all of CSS.
I've been doing HTML to PDF for just over 15 years.
Another option is using LibreOffice headless. Best is to design your PDF documents, then fill them dynamically.
The best way to test is using your web browser's Print Preview -> Print to PDF. From what I remember, Firefox was the best.
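A minimal sketch of the laravel-dompdf route (the Blade view name and data are assumptions):

use Barryvdh\DomPDF\Facade\Pdf;

// Renders a Blade view from resources/views/print/pdf/ and streams it as a download.
$pdf = Pdf::loadView('print.pdf.report', ['records' => $records]);

return $pdf->download('report.pdf');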
Developer654079525 started a new conversation
4mos ago
We want to export parts of our website into a downloadable, self-contained PDF file with a TOC. It will be generated from a separate, hidden layer of our app that has custom CSS, so it would be nice if the rendered PDF somewhat closely matched the HTML/CSS look. Which library that plays nicely with Laravel 12.* would you recommend (for commercial use)?
Developer654079525 liked a comment
4mos ago
What happens if I accidentally run composer update on a remote machine?
It updates packages to the latest versions allowed by your constraints, and generates a composer.lock file that differs from the one in version control. Your production environment may then be running package versions that you haven't tested.
In principle, that shouldn't cause issues if the version constraints in composer.json are sensible, meaning they don't allow updates between major versions (e.g. 5.x.x to 6.x.x). Only major versions are supposed to include backward-incompatible changes. But that's the theory. In practice you should always test the code you're about to deploy.
So, in a nutshell, is it composer update on a local machine, then push and composer install on a production machine?
That's the basic idea. If you used a CI/CD pipeline, the packages would be installed in the build environment and then pushed to the production server(s). In a simpler setup, you would run composer install as part of your deployment process on the production server.
Developer654079525 liked a comment
4mos ago
composer audit is a useful tool. It shows known vulnerabilities in the installed package versions. You should update all packages from time to time, but fixing vulnerabilities is the first priority.
Just to be clear: you shouldn't run composer update in production. You should run it in the development environment and test that everything works. That generates a composer.lock file in the project root, which tells Composer the exact package versions to install. You deploy that file along with everything else and then run composer install to install the packages.