Hi all,
I’m trying to add some rules to my robots.txt file using the filter reference on the TSF website.
This filter:
// This is a WordPress Core filter.
add_filter( 'robots_txt', function( $robots ) {
	$robots .= "\r\nDisallow: /my-custom-folder/\r\n";
	return $robots;
}, 11 );
I need to add a specific allow/disallow block for a different user agent, for an SEO tool that I use to improve my website. Is this possible with this filter?
I’m thinking along these lines:
User-agent: SEO Bot
Allow: /shop/
Allow: /blog/
Allow: /help/
Disallow: /

Would something like this be possible? There are certain URLs I don’t want the bot to crawl, because I don’t need them in my SEO tool.
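In filter form, I imagine it would look something like this — just a sketch, with "SEO Bot" as a placeholder for whatever the tool's real user-agent token is:

```php
// Appends a block for a specific crawler to the generated robots.txt.
// "SEO Bot" is a placeholder — replace it with the tool's actual
// user-agent token from its documentation.
add_filter( 'robots_txt', function( $robots ) {
	$robots .= "\r\nUser-agent: SEO Bot\r\n";
	$robots .= "Allow: /shop/\r\n";
	$robots .= "Allow: /blog/\r\n";
	$robots .= "Allow: /help/\r\n";
	$robots .= "Disallow: /\r\n";
	return $robots;
}, 11 );
```

Would appending the block like that work, or does the filter expect a different format?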
