Google is reporting "Blocked by robots.txt"

Anthony James

Last Update 4 months ago

In Google Search Console, "Blocked by robots.txt" is a status indicating that a particular URL on your store is being prevented from being crawled by search engine bots, based on the instructions in your robots.txt file.

You can view your robots.txt file by adding /robots.txt to the end of your domain name. For instance: example.com/robots.txt

The robots.txt file is a plain text file located in the root directory of your store that tells search engine crawlers which pages or sections of your store should not be accessed or crawled. It is the standard way to communicate crawl directives to search engine bots.
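As an illustration of the format (the path below is hypothetical, not taken from Shopify's actual defaults), a robots.txt file pairs a User-agent line with one or more Disallow or Allow rules:

```
# Apply these rules to all crawlers
User-agent: *

# Block crawling of a hypothetical private section
Disallow: /private-section/

# Everything not disallowed remains crawlable
Allow: /
```

Each Disallow rule blocks any URL whose path begins with the given prefix, for the crawlers named in the preceding User-agent line.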

When a URL is flagged as "Blocked by robots.txt," it means that the robots.txt file on your store contains instructions that prevent search engines from crawling and indexing that specific URL. This can be intentional, such as when you want to hide certain pages or directories from search engines.

Shopify automatically generates a default robots.txt file for every store, largely to help control duplicate content.
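For example, Shopify's default file blocks crawling of paths like the admin, cart, and checkout pages, which have no value in search results. The rules below are a representative sketch, not the full list; check your own store's /robots.txt for the exact contents:

```
# Representative rules of the kind found in Shopify's default robots.txt
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
```

If Google reports a page you *do* want indexed as "Blocked by robots.txt", compare its URL against the rules in your store's robots.txt file to see which directive is matching it.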