How to Set Up a Custom robots.txt in Blogspot Blogs for Better SEO?

So far, you have already learned how to use the Blogger sitemap generation tool for better SEO, how to use search descriptions for traffic generation, and how to use custom robots header tags for better SEO and traffic. Today, I will show you how to use the robots.txt file in blogger.com blogs to get the best from search engines. We can customize Blogger's default robots.txt file for our benefit, which is why it is known as a custom robots.txt file. Before using it, you must fully understand the keywords used in this file. Whether you are a beginner or an advanced user doesn't matter much, because I am providing complete information about robots.txt and the different keywords and tags used in this file.

What is a robots.txt file?
The robots.txt file is not very complicated. It is just a text file containing a few lines of simple directives. Every webmaster saves this file at the root of their website or blog's server. It instructs search engine crawlers to index or skip particular posts or pages of a website or blog. Search engine crawlers always scan the robots.txt file first and then crawl web pages according to the instructions it contains. This is quite helpful because, in many situations, you don't want search engine spiders to index a particular page, for example, a demo page or any other page that is not useful to you.
Understanding the Different Keywords and Tags of robots.txt
When you open the robots.txt file in Blogger, you will see the code given below:
User-agent: Mediapartners-Google
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
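If you'd like to see how a crawler interprets these rules, here is a small sketch using Python's standard-library urllib.robotparser (the blog address is the placeholder example.blogspot.com from the file above):

```python
from urllib.robotparser import RobotFileParser

# The default Blogger robots.txt rules, as shown above.
rules = [
    "User-agent: Mediapartners-Google",
    "User-agent: *",
    "Disallow: /search",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# Label/search pages are blocked; ordinary post pages are allowed.
print(rp.can_fetch("*", "http://example.blogspot.com/search/label/SEO"))        # False
print(rp.can_fetch("*", "http://example.blogspot.com/2013/06/some-post.html"))  # True
```

This is only a quick way to experiment with the rules on your own machine; real crawlers apply the same matching logic.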

You must understand the meaning of each keyword before applying it. To learn everything about them, read the explanations given below:

Explanation of Lines 1 and 2

User-agent: Mediapartners-Google

The above code is useful only for bloggers who are using Google AdSense ads on their blog. It helps the Google AdSense robots serve better ads on your blog. If you are not using Google AdSense, it is better to leave this section as it is.

Explanation of Line 3: “User-agent: *”

User-agent: *

As you may know, the asterisk (*) sign is commonly used as a wildcard meaning “all”. Here it is used to allow all kinds of robots to visit your blog.

The next keywords are very important because they act as a guard. They restrict or allow search engine bots to visit either your complete blog or a specific part of it. For a complete explanation, see below:

Explanation of Line 4: “Disallow: /search”


Blogger automatically adds this directive to the robots.txt file. It stops search engine robots from crawling all links that include the keyword “search” after the domain name. For example, a label link such as the following will be ignored:

http://example.blogspot.com/search/label/SEO

That means search engines will never show any result that appears through such a link, because it contains the “/search” keyword after the domain name.

How to use the “Disallow:” keyword to block search engine bots from crawling specific posts?

As I mentioned earlier, “Disallow:” is a very important keyword that can be used to hide individual posts and pages from search engines.

1. Hiding a particular post from search engines

If you don't want search engine robots to crawl a specific post, use the format given below:

Disallow: /yyyy/mm/your-post-url.html

In the above code, replace “yyyy” with the year, “mm” with the exact month, and “your-post-url” with the URL of that particular post. For example:
If you want to block a post located at http://www.geteverything.org/2013/06/blogger-custom-robot-txt-setup.html, you should use the code given below. Make sure you do not include the domain name in the robots.txt file:

Disallow: /2013/06/blogger-custom-robot-txt-setup.html
Note – If you are using a custom permalink for your blog post, use that same URL here.
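As a quick sanity check, a post-blocking rule like the one described above can be tested with Python's standard-library urllib.robotparser (the post URL below is the geteverything.org example from this section; the second URL is a made-up post for comparison):

```python
from urllib.robotparser import RobotFileParser

# A rule set hiding one specific post, as described above.
rp_post = RobotFileParser()
rp_post.parse([
    "User-agent: *",
    "Disallow: /2013/06/blogger-custom-robot-txt-setup.html",
    "Allow: /",
])

blocked = "http://www.geteverything.org/2013/06/blogger-custom-robot-txt-setup.html"
other = "http://www.geteverything.org/2013/07/another-post.html"
print(rp_post.can_fetch("*", blocked))  # False
print(rp_post.can_fetch("*", other))    # True
```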

2. Hiding a particular page from search engines

Using the same method, you can also disallow a particular page. Just use the format given below:

Disallow: /p/your-page-url.html
In the above code, replace “your-page-url” with the URL of your particular page. For example, to hide a page located at http://www.geteverything.org/p/our-services.html, you can add the code given below to your robots.txt file:

Disallow: /p/our-services.html
What happens after removing “Disallow: /search” from the robots.txt file?

As I mentioned earlier, the “Disallow:” keyword acts as a guard. If you remove this line, search engine robots will crawl and index everything (all content and web pages) present on your blog.
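You can verify this behaviour with the same standard-library parser: with the “Disallow: /search” line removed, a label page that was blocked before becomes crawlable (example.blogspot.com is the placeholder address used throughout this post):

```python
from urllib.robotparser import RobotFileParser

# Rules with and without the /search guard.
with_guard = RobotFileParser()
with_guard.parse(["User-agent: *", "Disallow: /search", "Allow: /"])

without_guard = RobotFileParser()
without_guard.parse(["User-agent: *", "Allow: /"])

label = "http://example.blogspot.com/search/label/SEO"
print(with_guard.can_fetch("*", label))     # False
print(without_guard.can_fetch("*", label))  # True
```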

Understanding the 5th Line: “Allow: /”

Allow: /

The above line is like the main gate of your blog: it permits search engines to crawl every link that is not blocked by a “Disallow:” rule. The default Blogger robots.txt file always contains this line.

Understanding the Last Line: Sitemap

The default sitemap that appears in Blogger's robots.txt file is given below:

Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED

I have already provided a tutorial on using a sitemap for better SEO. The above sitemap line tells search engine bots to crawl only the 25 most recent posts of your blog. If you want to increase this number, use the codes given below:

How to Instruct the Search Engine Bots to Crawl Your 500 Most Recent Posts?

Simply add the code given below to your robots.txt file:

Sitemap: http://yourblog.blogspot.com/feeds/posts/default?orderby=UPDATED
Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

In the above code, replace “yourblog.blogspot.com” with the URL of your blog.

How to Instruct the Search Engine Bots to Crawl Your 2000 Most Recent Posts?

Simply use the code given below. Note that each Blogger feed request returns at most 500 posts, so you need one sitemap line per batch of 500:

Sitemap: http://yourblog.blogspot.com/feeds/posts/default?orderby=UPDATED
Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1001&max-results=500
Sitemap: http://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1501&max-results=500

In the above code, replace “yourblog.blogspot.com” with the URL of your blog.
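If you prefer not to write these lines by hand, here is a small Python sketch (with yourblog.blogspot.com as a placeholder) that generates the default feed line plus one atom.xml sitemap line per batch of 500 posts:

```python
def sitemap_lines(blog, total_posts, batch=500):
    """Build Blogger sitemap lines: the default posts feed plus
    one atom.xml line per batch of `batch` posts."""
    lines = [f"Sitemap: http://{blog}/feeds/posts/default?orderby=UPDATED"]
    for start in range(1, total_posts + 1, batch):
        lines.append(
            f"Sitemap: http://{blog}/atom.xml?redirect=false"
            f"&start-index={start}&max-results={batch}"
        )
    return lines

# For 2000 posts this produces the default line plus four batches
# starting at posts 1, 501, 1001 and 1501.
for line in sitemap_lines("yourblog.blogspot.com", 2000):
    print(line)
```

Paste the printed lines into your custom robots.txt, replacing the placeholder with your own blog address.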

How to Set Up a Custom robots.txt File in Blogger.com Blogs?
Just follow the steps given below:
  • First of all, sign in to your Blogger account.
  • In Blogger's dashboard, click on “Settings”. The figure given below shows the same:

Settings option in blogger

  • In Settings, click on the “Search Preferences” option. A new screen will appear on the right side. Click the “Edit” link that appears beside the “Disabled” status of the “Custom robots.txt” option. The figure given below shows everything clearly.

Custom robots.txt setup in Blogger

  • Now enable this option by choosing the “Yes” radio button. After choosing “Yes”, you will see the screen given below:

SEO in Blogger using custom robots.txt settings

  • You already know everything about the different keywords and tags, so apply the ones that suit you best.
How to check the contents of your robots.txt file without logging into Blogger?
As I explained earlier in this post, the robots.txt file is always placed at the root of the server or blog. That is why you can easily check the contents of your robots.txt file by visiting:

http://yourblog.blogspot.com/robots.txt
Note – The above URL will show your custom rules only if you have enabled the custom robots.txt option in blogger.com.
From the Editor’s Desk
Now you know everything about the robots.txt file, so use it for the best results. Hide the content that is useless on your blog from search engines.
The above explanations are quite easy to understand and implement. However, if you still find anything difficult or face any problem, feel free to ask me via comments.
Ravi Kumar

