Remove robots.txt from your project and add it to .gitignore. Then SSH into the server for each environment and create that environment's robots.txt version there.
Sep 9, 2023
Level 1
How to have separate robots.txt files for staging and production sites on Laravel Forge?
I need to have different robots.txt files for my staging and production sites that are on the same Forge server. Here's what I'd like to do:
- in my Laravel project's public folder, I'd like to have a staging.robots.txt and a production.robots.txt.
- in the Forge deployment script, I'd like to check whether we're in the staging or production environment, then install the correct robots.txt file on the server. E.g. if we're in staging, copy staging.robots.txt to robots.txt and ignore production.robots.txt.
I have little experience with bash/deployment scripts, so would really appreciate some help with this. Thanks.
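The deployment-script idea described above can be sketched in bash. This is a minimal illustration, not Forge's actual script: the function name `install_robots` and the file names are assumptions, and in a real Forge deploy script you would read APP_ENV from the site's .env file (e.g. `APP_ENV=$(grep '^APP_ENV=' .env | cut -d= -f2)`).

```shell
#!/usr/bin/env bash
# install_robots: copy the environment-specific robots file into place.
install_robots() {
    local public_dir="$1" app_env="$2"
    if [ "$app_env" = "production" ]; then
        cp "$public_dir/production.robots.txt" "$public_dir/robots.txt"
    else
        cp "$public_dir/staging.robots.txt" "$public_dir/robots.txt"
    fi
}

# Demo in a throwaway directory so the script is runnable as-is:
demo=$(mktemp -d)
echo "Disallow: /" > "$demo/staging.robots.txt"
echo "Allow: /"    > "$demo/production.robots.txt"
install_robots "$demo" "staging"
cat "$demo/robots.txt"    # prints the staging rules: "Disallow: /"
```

In a real deploy script you would call the function once with the site's public path and its APP_ENV value, right after `git pull` runs.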
Level 9
Why not simply use your web routes and rely on an environment variable?
For example: remove the robots.txt file from your public folder, create two new files, robots-staging.txt and robots-live.txt, then use this code (or something similar) in your routes file:
Route::get('robots.txt', function () {
    // Default to the staging rules; you may create a dedicated
    // config/env variable for this part instead of checking app.env.
    $robotsFile = public_path('robots-staging.txt');

    if (config('app.env') === 'production') {
        $robotsFile = public_path('robots-live.txt');
    }

    return response(file_get_contents($robotsFile), 200)
        ->header('Content-Type', 'text/plain');
});
3 likes
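For the server-side approach suggested in the accepted answer (gitignore robots.txt and maintain one copy per server), the one-time setup on each environment might look like the following sketch. The paths and the staging rules shown are assumptions; run the equivalent via SSH on each Forge server.

```shell
#!/usr/bin/env bash
# Demo in a throwaway directory so the script is runnable as-is;
# on a real server you would run these commands in the site root.
cd "$(mktemp -d)"
mkdir -p public

# Stop tracking robots.txt in git (on the real repo, also run:
#   git rm --cached public/robots.txt)
echo "/public/robots.txt" >> .gitignore

# Create this environment's version by hand, e.g. block all
# crawlers on staging:
cat > public/robots.txt <<'EOF'
User-agent: *
Disallow: /
EOF

cat public/robots.txt
```

The trade-off versus the route-based answer: no PHP runs per request (the web server serves the static file directly), but the file lives outside version control, so each server must be set up once by hand.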