How To Optimize WordPress Robots.txt File For SEO

Optimize Robots.txt For WordPress: Have you ever heard about robots.txt? You certainly have. The WordPress robots.txt file is vital for a site's SEO performance and can significantly affect its search engine rankings. It tells search engines which parts of your site they should index and which they shouldn't. Not having a robots.txt file doesn't stop search engines from crawling and indexing your site, but sometimes you need to block search engine bots from crawling and indexing specific parts of your blog, and that is where the robots.txt file plays its part. Be careful, though: if the robots.txt file is wrongly configured, search engines may stop crawling important parts of your site, and those pages can disappear from search results. So while you should optimize robots.txt, you must never misconfigure it in a way that blocks bots from accessing the important parts of your blog.


By now you should understand why robots.txt is important for any blog or website, including a WordPress blog. You may be wondering: What is a robots.txt file, actually? How do you create one for a WordPress blog? Why is robots.txt important for SEO? This article covers it all, and you will also learn how to optimize your WordPress robots.txt for SEO.

What Is A Robots.txt File?

Robots.txt tells search engine bots which parts of your site to index and which parts to avoid. When crawling your website, search bots (also called search spiders) read the robots.txt file and follow its instructions. Based on those directives, a search engine will crawl or skip any given page of your website.
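To see how a well-behaved crawler honors these directives, here is a minimal sketch using Python's standard urllib.robotparser module; example.com and the URL paths are placeholders, not real sites:

from urllib.robotparser import RobotFileParser

# Download and parse the site's robots.txt file (placeholder domain)
rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Ask whether a given bot may fetch a given URL
print(rp.can_fetch("Googlebot", "https://example.com/wp-content/uploads/photo.jpg"))
print(rp.can_fetch("*", "https://example.com/wp-content/plugins/"))

A compliant crawler performs exactly this kind of check before fetching any page, which is why the directives you write in robots.txt matter.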

How To Create Robots.txt File?

The robots.txt file is located in the root folder of your website. You can use either cPanel or an FTP client to view it. If you are using WordPress, a virtual robots.txt is already generated by default, which you can see by visiting yoursite.com/robots.txt; it does not exist as a physical file in the root of your WordPress installation until you create one.

If your website doesn't have a robots.txt file, just create a plain text file in Notepad (or any text editor), name it robots.txt, and upload it to your site's root folder using an FTP client or cPanel.
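If you prefer to script the upload, here is a minimal sketch using Python's standard ftplib module; the host, username, and password are placeholders for your own hosting credentials:

from ftplib import FTP

# Connect to your web host and log in (placeholder credentials)
ftp = FTP("ftp.example.com")
ftp.login(user="your-username", passwd="your-password")

# Upload robots.txt to the site's root folder
with open("robots.txt", "rb") as f:
    ftp.storbinary("STOR robots.txt", f)

ftp.quit()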

You can also check BroughtMeUp's robots.txt file, or any site's, by appending /robots.txt to the site's root URL.

How To Use Robots.txt File?

The first line in robots.txt usually names a user agent. The user agent is the name of the search bot that is trying to crawl your site; examples include Googlebot and Bingbot. Using user agent names, you can restrict which bots are allowed to crawl your website.

But if you are looking for more search engine traffic, I recommend allowing every search bot. To address all bots at once, simply write User-agent: *
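For illustration, here is a hypothetical snippet that blocks a made-up bot (BadBot) from the entire site while allowing every other bot everywhere:

User-agent: BadBot
Disallow: /

User-agent: *
Allow: /

Rules under a named user agent apply only to that bot; the * group applies to everyone else.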

The next lines in robots.txt contain Allow or Disallow directives, which tell search engine bots which parts or pages of your website they may or may not crawl. A sample robots.txt file would be:

User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/

How To Optimize Robots.txt File For SEO?

To optimize robots.txt:

  • Don’t use robots.txt to hide low-quality content or to stop indexing of category and date archives; instead, use plugins like WordPress SEO Plugin By Yoast or Robots Meta to add noindex and nofollow meta tags. The reason is that robots.txt doesn’t stop search engines from indexing a URL, it only stops them from crawling it; a page blocked in robots.txt can still appear in search results if other sites link to it, whereas a noindex meta tag removes it properly.
  • Consider disallowing readme.html. The readme.html file reveals which WordPress version you are using to anyone who browses to it, which helps attackers target known vulnerabilities. Disallowing it keeps compliant bots away from that information. Simply write Disallow: /readme.html in your robots.txt file.
  • You should also disallow the WordPress plugin directory for similar security reasons. Simply write Disallow: /wp-content/plugins/
  • Disallow replytocom links to avoid duplicate content issues caused by comment reply URLs. Simply write Disallow: *?replytocom in your site’s robots.txt file.
  • Consider adding your site’s XML sitemap to robots.txt for better and faster indexing of your blog posts; a combined snippet covering these rules follows below.
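Putting the rules from this list together, a minimal combined snippet would look like this (the sitemap URL is a placeholder; use your site's actual sitemap URL):

User-agent: *
Disallow: /readme.html
Disallow: /wp-content/plugins/
Disallow: *?replytocom
Sitemap: https://yourdomain.com/sitemap.xml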

Read: WordPress Ping List For Faster Indexing Of Your Blog Post 

Optimized Robots.txt For WordPress:

User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /comments/feed/
Disallow: /trackback/
Disallow: /index.php
Disallow: /xmlrpc.php
Disallow: /wp-content/plugins/
Disallow: /readme.html
Disallow: *?replytocom

User-agent: NinjaBot
Allow: /

User-agent: Mediapartners-Google
Allow: /

User-agent: Googlebot-Image
Allow: /wp-content/uploads/

User-agent: AdsBot-Google
Allow: /

User-agent: Googlebot-Mobile
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml


This is an optimized robots.txt file for WordPress. You just need to replace the placeholder sitemap URL (https://yourdomain.com/sitemap.xml) with your actual sitemap link.

Is My Content Affected By The New Robots.txt File?

You can use the Fetch as Google feature in Google Webmaster Tools to check whether your content is affected by the new robots.txt file. Simply log in to your GWT account, go to Crawl > Fetch as Google, add your blog post URL, and check whether there is any issue accessing it.


Alternatively, you can use the robots.txt Tester in GWT to check whether the new robots.txt file blocks any of your URLs. Simply go to Crawl > robots.txt Tester, add the blog post or page URL, select the desired bot, and click Test; the tool will tell you whether that URL is denied by the new robots.txt file.


That's it, we are done optimizing the robots.txt file for SEO. I hope this guide has helped you understand the various aspects of robots.txt and that you are now using an optimized robots.txt file for your website. Lastly, if you enjoyed the article, please share it across the internet. For any feedback or query, do comment, and do subscribe for updates.

