
How do you protect your site from web scraping?

Jonjie (Level 12):

Just thinking: how do you prevent web scraping on your Laravel site?

0 likes
6 replies
automica:

@jonjie It's a bit of a battle: you want some bots, e.g. Googlebot, to index your site, but on the other hand you don't want other bots indexing and scraping your content.

One of the best ways is to make content accessible to logged-in users only, even if accounts are free.
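In Laravel, gating content behind a login is typically a middleware concern. A minimal sketch, assuming a standard Laravel app; the route paths and controller names here are hypothetical:

```php
<?php
// routes/web.php — hypothetical routes in a standard Laravel app.

use App\Http\Controllers\ArticleController;
use App\Http\Controllers\HomeController;
use Illuminate\Support\Facades\Route;

// Public landing page: fine for everyone, including crawlers.
Route::get('/', [HomeController::class, 'index']);

// Full article content only for authenticated users,
// so anonymous scrapers never see it.
Route::middleware('auth')->group(function () {
    Route::get('/articles/{article}', [ArticleController::class, 'show']);
});
```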

If you are dealing with non-logged-in users, then you should add Disallow rules to your robots.txt file. Well-behaved bots will skip that content, but note that robots.txt is purely advisory.
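For example, a robots.txt that asks crawlers to stay out of member-only areas might look like this (the paths are hypothetical):

```
User-agent: *
Disallow: /members/
Disallow: /account/
```

Compliant crawlers such as Googlebot honour these rules; a hostile scraper will simply ignore them, which is why this only complements the other measures in this thread.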

https://moz.com/learn/seo/robotstxt

sr57:

And also: convert pages to images, randomize HTML tags and attribute names, ban IPs if ..., block certain browser types, ..... But as automica wrote, it's a real battle and you never win; it's always possible to do a screen copy and ...
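One concrete way to act on abusive IPs in Laravel is its built-in `throttle` middleware, which rate-limits requests per client. A sketch, assuming a default Laravel setup; the route and controller names are hypothetical:

```php
<?php
// routes/web.php — hypothetical example using Laravel's throttle middleware.

use App\Http\Controllers\PostController;
use Illuminate\Support\Facades\Route;

// Allow at most 60 requests per minute per client on listing pages;
// beyond that, Laravel responds with 429 Too Many Requests.
// This slows down naive scrapers without affecting normal readers.
Route::middleware('throttle:60,1')->group(function () {
    Route::get('/posts', [PostController::class, 'index']);
});
```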

Or go the totally opposite way and try a win/win approach: let people know your data is easily available if, for instance, the 'user' references your site, ...

martbock:

To be honest, you cannot really prevent web scraping on your site. If it is publicly available, it is scrapable; a scraper with sufficient motivation will always find a way.

2 likes
Snapey:

You can make it more difficult by not using incrementing integers for record IDs.

1 like
Jonjie (Level 12):

@snapey Can you explain a bit more why or how that makes it more difficult? I think that's the best solution for today: just make scraping harder. I just need more explanation.

automica:

@jonjie I would guess that incremental IDs make it easy for a bot to guess your URL structure and step through your pages. If you use non-incrementing IDs or slugs, it's harder work, because the bot has to spider the site to discover links instead of just counting upward.
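To make that concrete, here is a plain-PHP sketch (no Laravel required; `randomId` is a hypothetical helper) of why random identifiers resist enumeration where sequential ones don't. Recent Laravel versions also ship a `HasUuids` trait for generating such keys on Eloquent models.

```php
<?php
// Sequential record IDs: a scraper can simply walk /posts/1, /posts/2, ...
$guessable = range(1, 5);

// Random IDs drawn from a 2^128 space: enumeration is hopeless, so a
// scraper must discover each URL by following links instead of counting.
function randomId(): string {
    return bin2hex(random_bytes(16)); // 32 hex characters
}

echo "Sequential: " . implode(', ', $guessable) . PHP_EOL;
echo "Random:     " . randomId() . PHP_EOL;
```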
