Represents a robots.txt file entity that controls web crawler access to a website or subdomain. It defines crawling rules and directives for search engines and other automated agents, guiding indexing behavior and signaling which site areas should not be crawled. Note that robots.txt directives are advisory; they do not enforce access restrictions.
Full text content of the robots.txt file.
Whether this robots.txt file uses Wix's default content.
If applicable, the target subdomain for the robots.txt file. For example, a site can have multiple subdomains, such as www.example.com, es.example.com, and fr.example.com. Each subdomain can have its own robots.txt file with different rules. Default: 'www'
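As a rough sketch of how these fields fit together, the following TypeScript snippet models the entity described above. The interface and field names (`content`, `isDefault`, `subdomain`) are illustrative assumptions, not the exact API schema:

```typescript
// Hypothetical shape of the robots.txt entity; field names are
// assumptions for illustration, not the exact API schema.
interface RobotsTxtEntity {
  content: string;    // full text content of the robots.txt file
  isDefault: boolean; // whether Wix's default content is used
  subdomain: string;  // target subdomain, e.g. "www" (the default)
}

// Example: a custom robots.txt for the "www" subdomain that blocks
// crawlers from a private area while allowing everything else.
const example: RobotsTxtEntity = {
  content: [
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
  ].join("\n"),
  isDefault: false,
  subdomain: "www",
};
```

A separate entity with `subdomain: "es"` could carry entirely different rules, matching the per-subdomain behavior described above.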