Robots.txt path setting
Severity: Medium
cms-settingskey-robotstxt
Summary
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. This is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google. To keep a web page out of Google, you should use noindex directives, or password-protect your page.
- Google Search Console Help
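As the quote notes, robots.txt controls crawling, not indexing. A minimal sketch of the two mechanisms (the `/admin/` path is a hypothetical example, not a value from your site):

```
# robots.txt — allow all crawlers, except for an example admin path
User-agent: *
Disallow: /admin/

# To keep a page out of Google's index, use a noindex directive
# in the page's HTML <head> instead:
#   <meta name="robots" content="noindex">
```

Note that a page disallowed in robots.txt can still appear in search results if other sites link to it, and a noindex directive only works if the page remains crawlable so Google can see it.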
Not sure what to do?
If you are ever unsure about making changes to your site, we encourage you to reach out to your Kentico Xperience Gold Partner. If you do not have a partner, feel free to contact the Constant Care For Kentico team to get connected with an expert.