
Checkup Documentation

Robots.txt path setting

severity-medium cms-settingskey-robotstxt

Summary

A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. This is used mainly to avoid overloading your site with requests from many automated tools.

A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, you should use noindex directives, or password-protect your page.

Google Search Console Help
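
To see what these rules mean in practice, here is a short sketch using Python's standard-library urllib.robotparser, which interprets a robots.txt file the same way a well-behaved crawler would. The example.com URLs are placeholders; substitute your own site's address.

```python
from urllib.robotparser import RobotFileParser

# Placeholder URL; point this at your site's actual robots.txt location.
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()  # fetches and parses the robots.txt file

# Ask whether a given crawler may request a given path,
# exactly as a well-behaved bot would before crawling it.
print(parser.can_fetch("Googlebot", "https://www.example.com/admin/"))
print(parser.can_fetch("*", "https://www.example.com/"))
```

Note that robots.txt only requests that crawlers stay away; it does not enforce anything, which is why Google recommends noindex directives or password protection for pages that must stay out of search results.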


Check Logic

Constant Care for Kentico verifies that the Robots.txt path setting has a value (this is the default behavior for this checkup).

You can manage your settings for this checkup in the Constant Care for Kentico admin settings.
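
For illustration only, the sketch below approximates what this check looks for: a non-empty value for the setting. It assumes Kentico stores settings in the CMS_SettingsKey table under the key name CMSRobotsPath, and it uses a placeholder connection string; verify both against your own installation before relying on it.

```python
import pyodbc

# Placeholder connection string; point this at your Kentico database.
CONNECTION_STRING = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Kentico;Trusted_Connection=yes;"
)

def robots_path_setting_has_value() -> bool:
    """Return True if the Robots.txt path setting is non-empty.

    Assumes settings live in the CMS_SettingsKey table under the
    key name CMSRobotsPath (both assumptions, not confirmed here).
    """
    with pyodbc.connect(CONNECTION_STRING) as conn:
        row = conn.cursor().execute(
            "SELECT KeyValue FROM CMS_SettingsKey WHERE KeyName = ?",
            "CMSRobotsPath",
        ).fetchone()
    return bool(row and row[0] and row[0].strip())

if __name__ == "__main__":
    print("Robots.txt path setting has a value:", robots_path_setting_has_value())
```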


Resolution

Follow Kentico's instructions for creating your robots.txt file:
https://docs.xperience.io/k12sp/configuring-kentico/search-engine-optimization/managing-robots-txt

To find the Robots.txt path setting, navigate to the Settings application and open the URLs and SEO category.

From there, locate the Search engine optimization (SEO) section and verify that the Robots.txt path setting is configured correctly. We highly recommend making sure this value is set.
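
Once the setting is configured, you can sanity-check from the outside that your site actually serves the file. The sketch below simply requests /robots.txt and reports the HTTP status; the example.com domain is a placeholder for your site's public URL.

```python
from urllib.request import urlopen
from urllib.error import HTTPError

# Placeholder domain; use your site's public URL.
url = "https://www.example.com/robots.txt"

try:
    with urlopen(url, timeout=10) as response:
        body = response.read().decode("utf-8", errors="replace")
        print(f"HTTP {response.status}: robots.txt is being served")
        print(body[:200])  # preview the first part of the file
except HTTPError as err:
    print(f"HTTP {err.code}: robots.txt is not reachable at {url}")
```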

Not sure what to do?

If you are ever unsure about making changes to your site, we encourage you to reach out to your Kentico Xperience Gold Partner. If you do not have a partner, feel free to contact the Constant Care for Kentico team to get connected with an expert.