Adding robots.txt rules for certain bots


Hi all,

I’m trying to add some rules to my robots.txt file using the filter referenced on the TSF website.

This filter:

// This is a WordPress Core filter.
add_filter( 'robots_txt', function( $robots ) {
    $robots .= "\r\nDisallow: /my-custom-folder/\r\n";
    return $robots;
}, 11 );

I need to add specific allow/disallow rules for a different user agent, for an SEO tool I use to improve my website. Is this possible with this filter?

I’m thinking along these lines:

User-agent: SEO Bot
Allow: /shop/
Allow: /blog/
Allow: /help/
Disallow: /

Would something like this be possible? There are certain URLs I don’t want the bot to crawl, because I don’t need them in my SEO tool.
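Yes, the same `robots_txt` filter can append a whole user-agent block, since it just receives and returns the robots.txt output as a string. A minimal sketch along those lines (note that "SEO Bot" and the paths are placeholders; substitute the actual user-agent token your tool sends):

```php
<?php
// Sketch only: builds the extra robots.txt block for a hypothetical
// "SEO Bot" user agent. Replace the token and paths with your tool's.
function my_seo_bot_robots( $robots ) {
    $rules = array(
        'User-agent: SEO Bot',
        'Allow: /shop/',
        'Allow: /blog/',
        'Allow: /help/',
        'Disallow: /',
    );
    // Append the block after the existing output, separated by a blank line.
    return $robots . "\r\n" . implode( "\r\n", $rules ) . "\r\n";
}

// In WordPress, hook it the same way as the snippet above:
// add_filter( 'robots_txt', 'my_seo_bot_robots', 11 );
```

Crawlers that honor robots.txt pick the most specific matching user-agent block, so rules under `User-agent: SEO Bot` apply only to that bot and leave the `User-agent: *` rules for everyone else.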

