Checkup Documentation

Robots.txt file is not valid

severity-medium robots-txt-valid

Summary

A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site. It is used mainly to avoid overloading your site with requests from automated tools.

A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, you should use noindex directives, or password-protect your page.

Google Search Console Help


Check Logic

Constant Care for Kentico checks that the robots.txt content length is greater than 0 (the default threshold).

You can manage your settings for this checkup in the Constant Care for Kentico admin settings.
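The check above can be sketched in a few lines. This is a minimal illustration, not the actual Constant Care for Kentico implementation; the function name, the `min_length` parameter, and the whitespace-stripping behavior are assumptions for the example.

```python
def robots_txt_is_valid(content: str, min_length: int = 1) -> bool:
    """Return True when the robots.txt content meets the minimum length.

    min_length mirrors the checkup's configurable threshold; the default
    (content length greater than 0) means any non-empty file passes.
    Leading/trailing whitespace is stripped so a file containing only
    blank lines does not pass (an assumption for this sketch).
    """
    return len(content.strip()) >= min_length


if __name__ == "__main__":
    print(robots_txt_is_valid(""))                        # empty file fails
    print(robots_txt_is_valid("User-agent: *\nAllow: /"))  # non-empty passes
```

Raising `min_length` in the checkup settings would make the check stricter than the default non-empty test.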


Resolution

If your robots.txt file does not exist, add one.

For Portal Engine sites, follow the instructions here:
https://docs.xperience.io/k12sp/configuring-kentico/search-engine-optimization/managing-robots-txt
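For reference, a minimal robots.txt that allows all crawlers and points to a sitemap might look like the following. The sitemap URL is a placeholder; adjust the `Disallow` rules to match the areas of your site you want to keep crawlers out of.

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

Remember that robots.txt only manages crawler traffic; use noindex directives or password protection to keep pages out of search results.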

Not sure what to do?

If you are ever unsure about making changes to your site, we encourage you to reach out to your Kentico Xperience Gold Partner. If you do not have a partner, feel free to contact the Constant Care for Kentico team to get connected with an expert.