
Checkup Documentation

Robots.txt path

cms-settingskey-robotstxt

Severity: High

Summary

A robots.txt file tells search engine crawlers which pages or files they can or can't request from your site. It is used mainly to prevent automated tools from overloading your site with requests.

A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, you should use noindex directives, or password-protect your page.

Source: Google Search Console Help
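
For reference, a minimal robots.txt file looks like the following; the disallowed path and sitemap URL here are purely illustrative:

```
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```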


Check Logic

By default, Constant Care for Kentico will verify that the Robots.txt path setting has a value.
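
As a rough illustration of what a check like this involves (a minimal sketch, not the actual Constant Care implementation), the setting can be read through the Kentico Xperience API. This assumes the setting is stored under Kentico's CMSRobotsPath settings key:

```csharp
using CMS.DataEngine;

public static class RobotsTxtPathCheck
{
    // Sketch only: returns true when the Robots.txt path setting has a
    // value for the given site. The "sitename.keyname" form reads the
    // site-level value of the CMSRobotsPath settings key.
    public static bool HasRobotsTxtPath(string siteName)
    {
        string value = SettingsKeyInfoProvider.GetValue($"{siteName}.CMSRobotsPath");
        return !string.IsNullOrWhiteSpace(value);
    }
}
```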

You can manage your settings for this checkup in the Constant Care for Kentico admin settings.


Verifying the Check

To find the Robots.txt path setting, navigate to the Settings application and then to the URLs and SEO category.

From there, find the Search engine optimization (SEO) section and verify that the Robots.txt path setting is configured correctly; we highly recommend it.
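The setting points to the route your site uses to serve robots.txt content, so that route must actually exist in your live-site application. The following is a minimal sketch for an Xperience 13 ASP.NET Core site; the route, controller name, and directives are illustrative assumptions, not part of Constant Care:

```csharp
using Microsoft.AspNetCore.Mvc;

public class RobotsTxtController : Controller
{
    // Sketch only: serves plain-text robots.txt content.
    // The route should match the value of the Robots.txt path setting.
    [HttpGet("/robotstxt")]
    public ContentResult Index()
    {
        const string content =
            "User-agent: *\n" +
            "Disallow: /admin/\n" +
            "\n" +
            "Sitemap: https://example.com/sitemap.xml\n";

        return Content(content, "text/plain");
    }
}
```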

Changes to your site should only be made by an experienced Kentico Xperience developer. If you need assistance making these changes, please reach out to the Toolkit For Kentico team to be connected with a Kentico Xperience partner.