How to Fix 'Indexed, though Blocked by robots.txt' in Blogger
If you run your site on Google's Blogger or on WordPress, you may have come across the "Indexed, though blocked by robots.txt" issue in Google Search Console when trying to get your pages indexed. Before looking for a solution, you should first understand what "Indexed, though blocked by robots.txt" actually means.
What is "Indexed, though blocked by robots.txt"?
This status means that Google has discovered and indexed the URL, but the website's robots.txt file blocks crawlers from fetching the page, so Google cannot read its content. The page can still appear in search results, usually without a proper description.
You can see how this issue is displayed in Google Search Console in the image below.
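In practice, you can check whether a particular page is blocked by your robots.txt with a few lines of Python. The sketch below is only an illustration using the standard urllib.robotparser module; the blog address and post URL are placeholders for your own.

# Minimal sketch: check whether Googlebot may crawl a given page.
# BLOG and PAGE are placeholders; substitute your own URLs.
from urllib.robotparser import RobotFileParser

BLOG = "https://www.example.com"           # placeholder blog address
PAGE = BLOG + "/2024/01/sample-post.html"  # placeholder post URL

parser = RobotFileParser()
parser.set_url(BLOG + "/robots.txt")
parser.read()  # download and parse the live robots.txt

if parser.can_fetch("Googlebot", PAGE):
    print("Googlebot is allowed to crawl this page.")
else:
    print("Googlebot is blocked by robots.txt for this page.")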
In this article, I will walk you through resolving the "Indexed, though blocked by robots.txt" issue in Blogger.
How to Fix "Indexed, though Blocked by robots.txt" in Blogger:
1. Open Blogger and select the blog from the top-left side.
2. Click on "Settings."
3. Scroll down to the "Crawlers and indexing" section. In this section, you will find options related to your blog's robots.txt and a few other settings.
4. First, enable "Custom robots.txt."
5. Click on "Custom robots.txt." In this field, you specify which user agents and paths to allow or disallow, and enter your sitemap URL.
6. You can generate a sitemap and custom robots.txt using the Blogger Sitemap Generator.
7. Visit the Blogger Sitemap Generator, enter your website address, and click on "Generate."
8. Now, copy the generated text and paste it into the "Custom robots.txt" field.
9. You need to add a few more lines to the generated text. Copy the text below and place it in your "Custom robots.txt" field; it should now look like this (a small sketch for verifying these rules appears after the steps):
# Sitemap built with https://www.labnol.org/blogger/sitemap
User-agent: *
Disallow: /search
Disallow: /category/
Disallow: /tag/
Allow: /
Sitemap: https://www.example.com/atom.xml?redirect=false&start-index=1&max-results=500
10. After that, enable "Custom robots header tags."
11. Click on "Home page tags" and enable "all" and "noodp," then click on "Save."
12. Click on "Archive and search page tags" and enable "noindex" and "noodp," then click on "Save."
13. Click on "Post and page tags" and enable "all" and "noodp," then click on "Save."
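Blogger applies these header tag choices to the pages it serves, either as a robots meta tag in the page or as an X-Robots-Tag HTTP header. If you want to confirm the settings took effect, the rough spot-check below fetches a couple of pages and prints whatever robots directives it finds; the addresses are placeholders for your own blog, and it uses only Python's standard library.

# Rough spot-check of the robots directives served by Blogger.
# The addresses below are placeholders; substitute your own blog.
import re
import urllib.request

PAGES = {
    "home page":   "https://www.example.com/",
    "search page": "https://www.example.com/search?q=test",
}

for name, url in PAGES.items():
    with urllib.request.urlopen(url) as response:
        header = response.headers.get("X-Robots-Tag")  # may be None
        html = response.read().decode("utf-8", "replace")
    meta = re.findall(r'<meta[^>]*name=["\']robots["\'][^>]*>', html, re.I)
    # Search/archive pages should carry noindex; the home page should not.
    print(f"{name}: X-Robots-Tag={header!r}, meta tags={meta}")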
After making these changes, navigate to Google Search Console and click on "Validate Fix."
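While you wait for Google to revalidate the affected pages, you can also confirm locally that the custom robots.txt from step 9 behaves as intended: search pages blocked, regular posts allowed, and the sitemap reachable. The following is a rough sketch with placeholder URLs; the site_maps() call requires Python 3.8 or newer.

# Rough check of the custom robots.txt rules and the declared sitemap.
# BLOG and the sample URLs are placeholders; substitute your own.
import urllib.request
from urllib.robotparser import RobotFileParser

BLOG = "https://www.example.com"  # placeholder blog address

parser = RobotFileParser()
parser.set_url(BLOG + "/robots.txt")
parser.read()

# Expected behaviour under the rules from step 9.
expectations = {
    BLOG + "/search/label/news": False,        # search pages: blocked
    BLOG + "/2024/01/sample-post.html": True,  # posts: allowed
}
for url, should_be_allowed in expectations.items():
    allowed = parser.can_fetch("Googlebot", url)
    flag = "OK" if allowed == should_be_allowed else "CHECK"
    print(f"[{flag}] {url} -> {'allowed' if allowed else 'blocked'}")

# Python 3.8+ exposes the Sitemap lines from robots.txt.
for sitemap in parser.site_maps() or []:
    with urllib.request.urlopen(sitemap) as response:
        print(f"Sitemap {sitemap} responded with HTTP {response.status}")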
That's it.
Within about 20-25 days, the "Indexed, though blocked by robots.txt" issue should be resolved.