A resource can be restricted from indexing in several ways:
– by a Disallow rule in the robots.txt file
– by a noindex X-Robots-Tag HTTP header
– by a noindex meta tag.
Each of these directives tells crawlers how to handle specific resources on the site. The robots.txt file lives in the site root, the X-Robots-Tag is sent as an HTTP response header, and the noindex meta tag sits in the page's HTML head. Depending on the directive, crawlers may be told not to index the page or resource, not to follow its links, and/or not to archive its contents. So make sure that your unique and useful content is available for indexing.
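For reference, the three restriction mechanisms typically look like this (the paths and values below are illustrative examples, not recommendations):

```text
# 1. robots.txt (a file in the site root) – blocks crawling of a path
User-agent: *
Disallow: /private/

# 2. X-Robots-Tag – sent as an HTTP response header
X-Robots-Tag: noindex, nofollow

# 3. Noindex meta tag – placed inside the page's <head>
<meta name="robots" content="noindex, nofollow">
```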
How to resolve it: #
It is recommended to re-check your robots.txt file, X-Robots-Tag headers, and noindex meta tags, and to make sure that all your useful content gets indexed and is not blocked by mistake.
To make sure that no website section with valuable content is restricted from indexing, check the Disallow rules in your robots.txt file.
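One quick way to audit your Disallow rules is Python's standard-library robots.txt parser. This is a minimal sketch: the robots.txt content and the paths being checked are hypothetical examples, so substitute your own file and URLs.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content to audit (replace with your own file's text)
robots_txt = """\
User-agent: *
Disallow: /private/
Disallow: /tmp/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether a crawler is allowed to fetch specific paths
print(parser.can_fetch("Googlebot", "/blog/post-1"))   # → True (not disallowed)
print(parser.can_fetch("Googlebot", "/private/data"))  # → False (blocked by rule)
```

If a path with valuable content comes back `False`, the matching Disallow rule is a candidate for removal.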
Option 1: WordPress CMS #
For WordPress, you can use the File Editor option in your SEO plugin to remove restrictions from robots.txt or adjust them as needed.
Another option is to remove the noindex tag by unchecking the corresponding checkbox in the WordPress backend general settings tab or in your SEO plugin.
Option 2: Any other CMS #
For any other CMS, such as Joomla, Shopify, or Magento, ask your web developer to check the robots.txt file in the root folder where your website is hosted and remove the indexing restrictions. If you still need more info, you can contact the RankChutney team.