Robots Txt Object


Represents a robots.txt file entity that controls web crawler access to a website or subdomain. It defines crawling rules and directives for search engines and other automated agents, guiding indexing behavior and telling crawlers which site areas to stay out of. Note that these rules are advisory: compliant crawlers honor them, but robots.txt does not enforce access control.
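
The content property holds standard robots.txt directives. For reference, a minimal sketch of such content might look like the following (the disallowed path and sitemap URL are hypothetical):

```
# Apply the rules below to all crawlers.
User-agent: *
# Hypothetical area to keep out of search results.
Disallow: /admin/
# Hypothetical sitemap location.
Sitemap: https://www.example.com/sitemap.xml
```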

Properties
content: string (maxLength: 720000)

Full text content of the robots.txt file.


default: boolean

Whether this robots.txt file uses Wix's default content.


subdomain: string (maxLength: 63)

If applicable, the target subdomain for the robots.txt file. For example, a site can have multiple subdomains, such as www.example.com, es.example.com, and fr.example.com. Each subdomain can have its own robots.txt file with different rules. Default: 'www'
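
Putting the properties together, here is a sketch of the object shape as a TypeScript interface, inferred from the fields above (the interface name and the example values are assumptions, not part of the documented API):

```typescript
// Sketch of the object shape inferred from the documented properties.
// The interface name is an assumption.
interface RobotsTxt {
  // Full text content of the robots.txt file (maxLength: 720000).
  content: string;
  // Whether this robots.txt file uses Wix's default content.
  default: boolean;
  // Target subdomain for the file, e.g. 'www', 'es', 'fr'. Default: 'www'.
  subdomain?: string;
}

// Hypothetical example: a custom robots.txt for the 'es' subdomain.
const esRobotsTxt: RobotsTxt = {
  content: 'User-agent: *\nDisallow: /admin/\n',
  default: false,
  subdomain: 'es',
};
```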
