
doabledanny

How to have separate robots.txt files for staging and production sites on Laravel Forge?

I need to have different robots.txt files for my staging and production sites that are on the same Forge server. Here's what I'd like to do:

  • In my Laravel project's public folder, I'd like to have a staging.robots.txt and a production.robots.txt.
  • In the Forge deployment script, I'd like to check whether we're in the staging or production environment, then put the correct robots.txt file on the server. E.g. if we're in staging, copy staging.robots.txt to robots.txt and ignore production.robots.txt.

I have little experience with bash/deployment scripts, so would really appreciate some help with this. Thanks.
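
For illustration, here's roughly the shape I'm imagining near the end of the deployment script. This is only a sketch: the $FORGE_SITE_BRANCH variable that Forge exposes to deployment scripts, the branch names, and the assumption that the script has already cd'd into the site directory are all things you'd adjust to your setup.

# Sketch: pick the robots.txt that matches the deployed branch.
# Assumes staging deploys from the "staging" branch and that we are
# already in the site root (Forge scripts usually cd $FORGE_SITE_PATH first).
if [ "$FORGE_SITE_BRANCH" = "staging" ]; then
    cp public/staging.robots.txt public/robots.txt
else
    cp public/production.robots.txt public/robots.txt
fi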

0 likes
5 replies
MohamedTammam

Remove robots.txt from your project and add it to .gitignore. Then access the server for each environment and add that environment's version of robots.txt there directly.
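
For example (a sketch only; the host names and site paths below are placeholders for your own):

# 1) In .gitignore, ignore the file so deploys never track or overwrite it:
#        /public/robots.txt
# 2) Then, once per environment, copy the right version up from your machine:
scp staging.robots.txt forge@your-server:/home/forge/staging.example.com/public/robots.txt
scp production.robots.txt forge@your-server:/home/forge/example.com/public/robots.txt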

1 like
karamqubsi

Why not simply use your web routes and depend on an env variable? For example: in your public folder, remove the robots.txt file and create two new files, robots-staging.txt and robots-live.txt. Then use this code (or something similar) in your route file:

Route::get('robots.txt', function () {
    // Default to the staging file. Use whatever logic you like here;
    // you may create new config and env variables for this part.
    $robotsFile = public_path('robots-staging.txt');

    if (config('app.env') === 'production') {
        $robotsFile = public_path('robots-live.txt');
    }

    return response(file_get_contents($robotsFile), 200)
        ->header('Content-Type', 'text/plain');
});
3 likes
TECreasey

@karamqubsi I have always wondered about a way to do this more automatically, and I don't know why I hadn't thought of it before.

Brilliant.
