Having a custom robots.txt file in your Blogger settings is important for SEO. It helps search engine bots find your pages and crawl them regularly, which can improve your rankings. If your blog doesn't have a customized, SEO-friendly robots file, you can't expect good rankings: search engine crawlers don't read your whole blog on their own; they look for a well-formed robots.txt file with an optimized blog sitemap.

Many newbie bloggers ask: how do I add a custom robots.txt in Blogspot? It's the right question, and it deserves a clear answer.

But what are the actual benefits of having this important file?

– There are many benefits to having this file. You can tell robots which links to follow and which blocked links to ignore, and you can include your blogger sitemap here, which helps every search engine discover the whole blog at once.

You can also give a nofollow attribute to specific pages or posts. But if you don't set the file up properly, mistakes can harm your blog's SEO. So be careful and follow these tips and tricks for a better result.
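For example, to keep every crawler away from one specific page, you could add a Disallow line for its path (the page URL below is hypothetical):

```
User-agent: *
Disallow: /p/private-page.html
Allow: /
```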

What is Robots.txt?

Every blog comes with a default robots.txt file containing a few simple directives. They make your blog visible and crawlable to Google, Bing, Yandex, and other search engines.

Before search engines start crawling your blog, they consult your robots file. So you just need to keep the proper settings here and avoid mistakes.

If you are using a free Blogspot blog, you can see the following default robots file in settings:

User-agent: Mediapartners-Google
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://yourblog.blogspot.com/feeds/posts/default?orderby=UPDATED

Now let's discuss what each line means.

In this robots file you see five lines of directives. These are the core of a custom robots.txt. You can use them as they are by default, but for better SEO you may want to customize them; customizing is optional.

User-agent: Mediapartners-Google

If you are using Google AdSense to monetize your blog, you need this line. It lets the AdSense crawler cache your pages properly and show relevant ads on your blog.
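In a robots.txt file, an empty Disallow line means no restrictions, so the AdSense crawler may fetch every page. Its block is typically written like this:

```
User-agent: Mediapartners-Google
Disallow:
```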

Disallow: /search

When anyone searches on your blog, results appear under this /search path. If you don't disallow it in robots.txt, randomly generated search URLs get sent to search engines, which creates duplicate pages.
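For instance, search and label pages produce URLs like the hypothetical ones below; the Disallow: /search rule keeps them out of the index:

```
https://yourblog.blogspot.com/search?q=seo+tips
https://yourblog.blogspot.com/search/label/SEO
```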

Note that Allow: / here means your homepage is open to Google, Bing, and other crawlers, so they can pick up updates.

User-agent: *

Every search engine treats the asterisk (*) as a wildcard matching its crawler. This line applies the rules that follow to all search engines. If you haven't added it, do so now so that all search engines are allowed to crawl your website.

Sitemap:

This is the Blogspot sitemap line; it lets search engines find all of your content in one place. Without this important part, crawlers may miss much of your blog, so make sure you use it correctly. It can bring the biggest SEO gains for your blog.

The default feed sitemap above covers a maximum of only 25 posts. If you have more than 25 posts on your blog, use this sitemap instead: “https://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500”
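Putting the pieces together, a complete custom robots.txt for a blog with more than 25 posts could look like this (replace yourblog with your own subdomain):

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
```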

How to add a blogger custom robots.txt?

We have discussed what the file is, how to modify it, and how to use it. Now it's time to submit it on Blogger, because without this submission your work has no effect. Just follow these steps to add the robots file in Blogspot:
1. Go to your blog from dashboard
2. Now go to Settings > Search preferences > Custom robots.txt > YES
3. Now paste your robots.txt code in the box
4. Finally, click on Save changes
5. You’re done!

To check your robots file, visit: https://your-blog.blogspot.com/robots.txt
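Before pasting the file into Blogger, you can also sanity-check its rules locally with Python's standard-library robots parser. This is a quick sketch, and the blog URL in it is a placeholder:

```python
# Sketch: validate a Blogger-style robots.txt with Python's built-in parser
# before submitting it. "yourblog.blogspot.com" is a placeholder domain.
import urllib.robotparser

robots_txt = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Search pages must be blocked for ordinary crawlers; the homepage stays open.
print(parser.can_fetch("*", "https://yourblog.blogspot.com/search?q=seo"))  # False
print(parser.can_fetch("*", "https://yourblog.blogspot.com/"))              # True
```

If the first call prints True, the Disallow: /search rule is missing or mistyped, so fix the file before saving it in Blogger.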

Final Thoughts

If you want your blog to be visible in search engines, you must keep a well-formed robots file. The tips above help you set up a Blogger custom robots.txt file for better SEO. Just follow this step-by-step guide and boost your search rankings. Cheers!