
chrisgrim

Robots.txt file not blocking indexing

Hi all, I have a Laravel project live on the web, but I want to block search engines from crawling it. I read online that to do this you create a robots.txt file containing:

User-agent: *
Disallow: /

and put it in your site's folder. I put a copy of this file in my app root folder and a copy in the public folder. However, when I check in Google Search Console, it says "URL is on Google". Am I doing this wrong for a Laravel project?

Snapey

Public folder only. Your project root should be inaccessible from the web.
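Once the file is in `public/`, it's worth confirming the rules actually block crawlers. A quick sketch using Python's standard-library robots.txt parser (the domain is a placeholder); note that an empty `Disallow:` line allows everything, while `Disallow: /` blocks all paths:

```python
import urllib.robotparser

# Rules intended to block all crawlers from every path
rules = """User-agent: *
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Both the root and any sub-path should be disallowed for any user agent
print(parser.can_fetch("*", "https://example.com/"))              # False
print(parser.can_fetch("Googlebot", "https://example.com/page"))  # False
```

Keep in mind that robots.txt only blocks crawling; a URL that other sites link to can still appear in Google's index. To block indexing outright, serve a `<meta name="robots" content="noindex">` tag or an `X-Robots-Tag: noindex` header.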

