Robots.txt Generator

robots.txt is a plain-text file, placed at the root of your domain, that tells search engine crawlers which pages or files they may and may not request from your site. This tool helps you create a properly formatted robots.txt file that follows SEO best practices and the official Robots Exclusion Protocol.
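As a quick sanity check, a generated file can be validated with Python's standard-library urllib.robotparser; the file contents and URLs below are illustrative, not this tool's exact output:

```python
import urllib.robotparser

# Illustrative robots.txt, similar to what this generator produces.
robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /cgi-bin/

Sitemap: https://example.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# can_fetch(useragent, url) applies the rules the way a crawler would.
print(parser.can_fetch("*", "https://example.com/about/"))  # True
print(parser.can_fetch("*", "https://example.com/admin/"))  # False
```

Running this before deploying catches malformed directives early, since the parser silently ignores lines it cannot understand.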

Default – All Robots are:

Sets the default rule for all crawlers (User-agent: *): either allow or refuse access to the whole site.

Crawl Delay

Optional. The number of seconds a crawler should wait between successive requests. Note that Googlebot ignores Crawl-delay; Bing and Yandex honor it.

Sitemap URL

Optional. Enter the full URL to your sitemap.xml file.

Search Engine Specific Rules

Set custom rules for specific search engine bots; selecting "Default" adds no bot-specific rule.
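A bot-specific rule targets the crawler's User-agent token. The fragment below is an illustrative example of what such rules look like, not this tool's exact output:

```
User-agent: Googlebot
Disallow: /search/

User-agent: Bingbot
Crawl-delay: 5
```

Each User-agent line starts a new group, and a crawler obeys only the most specific group that matches its token.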

Restricted Directories

Paths are relative to the site root and must end with a trailing slash ("/"). For example, enter "/admin/" to block the admin directory.
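Putting the fields above together, the assembly logic of a generator like this can be sketched as follows; the function name and parameters are hypothetical, not this tool's actual interface:

```python
def build_robots_txt(default_allow=True, crawl_delay=None,
                     sitemap_url=None, restricted_dirs=()):
    """Assemble robots.txt text from the form fields (hypothetical API)."""
    lines = ["User-agent: *"]
    if not default_allow:
        lines.append("Disallow: /")       # refuse everything by default
    for path in restricted_dirs:
        if not path.endswith("/"):
            path += "/"                   # enforce the trailing-slash rule
        lines.append(f"Disallow: {path}")
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    if sitemap_url:
        lines.append("")                  # blank line before the Sitemap entry
        lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"

print(build_robots_txt(restricted_dirs=["/admin", "/cgi-bin/"],
                       sitemap_url="https://example.com/sitemap.xml"))
```

Keeping the Sitemap directive on its own line at the end mirrors the layout most crawlers expect, although the protocol allows it anywhere in the file.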