I’m trying to block crawler bots from viewing my site, but I still want search engines to be able to crawl it.
I looked around and found the Blackhole for Bad Bots plugin, but it caused the sites I installed it on to return a 403 error every time I tried to edit anything (and this kept happening even after I removed the plugin).
Does anyone have suggestions for something I can use to do this job?
Cloudflare > WAF rules: set up a custom rule that blocks based on the User-Agent header, or use their managed “bad bots” protections.
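As a rough sketch, a Cloudflare custom WAF rule expression to block by User-Agent might look like the following (the bot names here are just examples, not a vetted blocklist):

```
; Cloudflare WAF custom rule — action: Block
; Matches requests whose User-Agent contains any of the listed bot names
(http.user_agent contains "MJ12bot")
or (http.user_agent contains "AhrefsBot")
or (http.user_agent contains "SemrushBot")
```

You’d paste the expression into the rule builder’s “Edit expression” view and set the action to Block. Be careful not to match strings that appear in legitimate search-engine user agents.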
You can also edit robots.txt and disallow the crawlers you don’t want. Keep in mind robots.txt is advisory only: well-behaved crawlers respect it, but abusive bots typically ignore it, so pair it with server-level blocking if the bots persist.
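For example, a robots.txt that lets search engines in while disallowing specific crawlers could look like this (again, the blocked bot names are placeholders for whatever is hitting your site):

```
# Search engines: allowed everywhere
User-agent: Googlebot
Disallow:

User-agent: Bingbot
Disallow:

# Unwanted crawlers (example names): blocked site-wide
User-agent: MJ12bot
Disallow: /

User-agent: AhrefsBot
Disallow: /

# Everyone else: allowed
User-agent: *
Disallow:
```

An empty `Disallow:` means “nothing is disallowed,” i.e. full access; `Disallow: /` blocks the whole site for that user agent. Crawlers use the most specific `User-agent` group that matches them.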