
Robots.txt Generator

Create a robots.txt file to control search engine crawlers


What is robots.txt?

The robots.txt file tells search engine crawlers which pages or sections of your site should or should not be crawled. It is placed in the root directory of your website (e.g., https://example.com/robots.txt) and is one of the first files crawlers check before indexing your site.
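A minimal robots.txt might look like this (the paths and sitemap URL are illustrative, not prescriptive):

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Sitemap: https://example.com/sitemap.xml
```

This allows all crawlers everywhere except `/admin/`, carves out an exception for `/admin/public/`, and points crawlers to the sitemap.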

Important: robots.txt is a suggestion, not a security mechanism. Well-behaved crawlers like Googlebot will respect it, but malicious bots may ignore it entirely.

Common Directives

- User-agent: specifies which crawler the rules apply to (* means all crawlers).
- Allow: permits crawling of specific paths.
- Disallow: blocks crawling of specific paths.
- Sitemap: points crawlers to your XML sitemap for better discovery.
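You can verify how these directives combine using Python's standard-library `urllib.robotparser`, which implements the same matching logic well-behaved crawlers use. The rules below are a made-up example for illustration (note that Python's parser applies rules in file order, so the more specific Allow line is listed first):

```python
from urllib import robotparser

# Hypothetical robots.txt content for demonstration purposes.
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The Allow exception permits this path despite the broader Disallow.
print(rp.can_fetch("*", "https://example.com/admin/public/page"))  # True

# Everything else under /admin/ is blocked.
print(rp.can_fetch("*", "https://example.com/admin/secret"))       # False

# Paths with no matching rule are crawlable by default.
print(rp.can_fetch("*", "https://example.com/blog/post"))          # True
```

In production you would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to fetch the live file instead of parsing a string.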