Robots

The [robots] table is used to write a robots.txt file for the website.

Whether the robots file is generated is normally controlled by the profiles filter, with one exception: if sitemaps are being generated, the profiles filter is ignored and the robots file is always generated. This is because the robots file tells bots where to find the sitemap files, so it must exist whenever sitemap generation is enabled.

Rules are added to the [robots] table by specifying a user agent, followed by its allow and disallow patterns. For example:

[robots."*"]
allow = ["*"]

[robots."googlebot"]
allow = ["/"]
disallow = ["/images"]
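With the rules above, the generated robots.txt would look roughly like the following sketch. The Sitemap line and the example.com URL are illustrative assumptions; the line only appears when sitemap generation is enabled, and the actual URL depends on your site configuration:

User-agent: *
Allow: *

User-agent: googlebot
Allow: /
Disallow: /images

Sitemap: https://example.com/sitemap.xml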
