How to Fix “Indexed, though blocked by robots.txt”

Are you demotivated by a drop in your website’s search rankings? If Google stops crawling some of your pages, those dropped rankings can gradually lead to fewer visits and, in turn, fewer conversions.

The “Indexed, though blocked by robots.txt” error means search engines have run into a problem while crawling your site. When it appears, Google knows about a page and has indexed it, but the robots.txt file prevented Google from crawling the page’s content. You can edit the robots.txt file to control which pages search engines should and should not crawl.

For better understanding, we’ll explain how to fix the “Indexed, though blocked by robots.txt” error and demonstrate two different methods. Let’s get started!

What does indexing have to do with robots.txt?

Before we delve into the “Indexed, though blocked by robots.txt” error itself, it is important to understand the connection between indexing and the robots.txt file.

When search engine robots such as Googlebot crawl your website, they first look for a file named robots.txt. In this file, the bots read instructions listing which URLs or directories they may or may not visit before carrying out the indexing process. Note that this is a different problem from the “Discovered – currently not indexed” error.
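To illustrate, a minimal robots.txt file might look like this (the paths below are hypothetical examples, not rules from any real site):

```
User-agent: *
Disallow: /private/
Disallow: /tmp/
Allow: /blog/
```

Here, any crawler matching the wildcard User-agent is told to stay out of /private/ and /tmp/, while /blog/ remains crawlable.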

Nevertheless, it should be noted that the robots.txt file controls crawling, not indexing; indexing directives belong in the page itself, for example in an HTML meta robots tag. When a page is blocked in the robots.txt file but other websites link to it, search engines may still decide to index it without seeing its content, which produces “Indexed, though blocked by robots.txt” in Google Search Console.
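The distinction matters in practice: if your goal is to keep a page out of the index, the reliable mechanism is a page-level directive such as the meta robots tag, and the page must remain crawlable so Google can actually see that directive. A minimal example:

```html
<!-- Placed in the page's <head>: tells search engines not to index this page -->
<meta name="robots" content="noindex">
```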

Let’s move on to the underlying cause of the “Indexed, though blocked by robots.txt” error. It describes a situation where Google indexes a page on your website even though the robots.txt file blocks it. In other words, Google is aware of the page and has put it in the index, but the content hasn’t been crawled because of the restrictions described in the robots.txt file.

This typically happens when a page is disallowed in the robots.txt file but some external links still point to it. Google discovers the page through these links and uses the link context and relevance as the basis for indexing it, even though its robots cannot read the page’s content.

Why is “Indexed, though blocked by robots.txt” bad for SEO?

You might be thinking, “If Google has my page in the index, isn’t that a good thing?” When a page is indexed though blocked by robots.txt, it can negatively impact your SEO efforts in two main ways:

  1. Poor User Experience: Because Google cannot crawl the page, its search listing often lacks a proper description (sometimes showing “No information is available for this page”). Users who click through such an unhelpful result may quickly return to the search results to look at other pages, and Google may conclude that your page doesn’t meet the searcher’s needs.
  2. Incomplete Information: Since Google has not crawled the page’s content, it relies on external signals such as links and anchor text to infer the page’s theme and meaning. The search engine cannot fully understand what your page is about, which can hurt your ranking or make your page appear irrelevant.

Different Ways to Fix “Indexed, Though Blocked by robots.txt”

Now that you have a clear picture of this issue, let’s look at effective ways to solve it and get your pages properly crawled and indexed by search engines.

Method 1: Edit robots.txt Directly

The most straightforward solution to this “Indexed, though blocked by robots.txt” issue is to modify your robots.txt file directly. Here’s how you can do it:

  1. Connect to your website’s root folder using an FTP client or the file manager of your hosting provider.
  2. Find the robots.txt file, then open it with a text editor.
  3. Find the line that blocks the exact page or directory returning the error. It typically looks like “Disallow: /page-URL/”.
  4. Modify or remove the disallow rule to let crawlers access the previously blocked page or directory.
  5. Save and upload the updated robots.txt file back to the root directory of your website.

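As a sketch of steps 3–4, suppose the file contains a rule like this (the path is a hypothetical example):

```
User-agent: *
Disallow: /blog/important-page/
```

Deleting the Disallow line, or narrowing its path so it no longer matches, lets crawlers reach /blog/important-page/ again.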
Removing the “disallow” directive allows search bots to crawl and index the page without problems.
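After editing, you can sanity-check the new rules locally before uploading. This is a minimal sketch using Python’s standard-library robots.txt parser; the rules and URLs are hypothetical examples:

```python
# Check whether given URLs are crawlable under a robots.txt policy,
# using Python's standard-library parser.
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the wildcard "*" group here.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

In practice you would paste your edited rules into the string (or point the parser at your live robots.txt URL) and test the URLs that Search Console flagged.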

Method 2: Use an SEO Plugin

If you use a content management system (CMS) like WordPress, SEO plugins can simplify editing your robots.txt file. Popular plugins like Yoast SEO and All in One SEO Pack have user-friendly interfaces for modifying the robots.txt file without dealing with the raw code.

Here’s how you can use an SEO plugin to fix the “Indexed, though blocked by robots.txt” error:

  1. Install and configure the SEO plugin of your choice.
  2. Find the plugin’s “File Editor,” where you will locate the robots.txt section.
  3. Identify the disallow directives causing the error and remove or correct them as required.
  4. Finally, remember to press Save, and the plugin will update your robots.txt file automatically.

Method 3: Use Rank Math

Rank Math is another powerful WordPress SEO plugin that offers an easy way to modify your robots.txt file. Here’s how you can use Rank Math to resolve the “Indexed, though blocked by robots.txt” error:

  1. Install and activate the Rank Math plugin on your WordPress site.
  2. Navigate to the “General Settings” tab in the Rank Math dashboard.
  3. Continue to the “Edit robots.txt” section of the page and click the “Edit” button.
  4. Find the disallow directives that give rise to the error and then delete or modify them as appropriate.
  5. Finally, click “Save Changes” to apply your edits to the robots.txt file.


Getting over the “Indexed, though blocked by robots.txt” error may look intricate at first sight. However, once you have the necessary tools and knowledge, you can deal with it easily and unlock your website for visitors. Left unaddressed, pages blocked by robots.txt can drag down your website’s SEO and organic traffic. By following the steps above, you can find and fix the “blocked by robots.txt” issue so that your most important pages no longer appear inaccessible to your users.

If you found this guide helpful, check out more SEO tips and strategies on the AlgoSaga agency portal. Online success is possible when you work with a team of professionals committed to making your website rank as well as its competitors, and we can help you improve your online presence and achieve your targeted goals. Keep an eye out for upcoming articles on this and other useful SEO tools to up your game!
