Robots.txt, part of the Robots Exclusion Protocol, is a file that gives web robots instructions about your blog. It tells web crawlers which pages of your blog they may crawl and which should be indexed by search engines.
Crawlers are automated, so before they visit your blog, the robots.txt file is what tells them whether they are allowed to crawl a particular page. On WordPress, SMF, Joomla and other CMS platforms we normally use custom robots.txt files, but on Blogger blogs the feature is disabled by default and the default file is used.
Today I will be showing you how to use a custom robots.txt file for better search engine optimisation, which often helps bring in the traffic you want.
How To Enable Custom Robots.txt in Blogger blogs
- Log in to your dashboard
- Select the blog you want to add the custom robots.txt to
- Click on the drop-down menu and select Settings
- Click on Search Preferences
- Just beside Custom robots.txt, click on Edit
- Click on Yes and a box will appear
- Paste the code below into the box
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: http://yourblogurl.blogspot.com/feeds/posts/default?orderby=update
- Then click on Save Changes to save the code.
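If you want to see what the rules above actually do, here is a short sketch using Python's standard-library urllib.robotparser. The example paths are made up, and the Sitemap line is left out because it does not affect crawl permissions; the point is just to show which URLs each crawler may fetch under these rules.

```python
# Parse the same rules as in the tutorial and test them with
# Python's built-in robots.txt parser. Example paths are hypothetical.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Ordinary crawlers may fetch posts, but not the /search label pages
print(rp.can_fetch("*", "/2015/06/sample-post.html"))             # True
print(rp.can_fetch("*", "/search/label/seo"))                     # False

# Mediapartners-Google (the AdSense crawler) has an empty Disallow,
# which means it is allowed to fetch everything
print(rp.can_fetch("Mediapartners-Google", "/search/label/seo"))  # True
```

This is why the snippet keeps label and search-result pages out of the index (they duplicate your post content) while still letting every crawler reach the posts themselves.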
If you have any questions about this, please leave a comment and I will see how I can help.