Robots.txt Generator

Build a simple robots.txt file for crawlers, combining Allow, Disallow, Crawl-delay, and Sitemap directives in one place.

Generated robots.txt

User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
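The generator can also emit a Crawl-delay directive. Note that Crawl-delay is non-standard: some crawlers (such as Bing and Yandex) honor it, while Google ignores it. A variant of the file above with a hypothetical 10-second delay might look like:

```
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```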

Frequently asked questions

What does robots.txt do?

A robots.txt file gives crawling instructions to bots at the domain root, including which paths are allowed or disallowed and where the sitemap is located.

Does robots.txt block pages from the web?

No. It is a crawler directive, not an access-control mechanism: compliant bots skip disallowed paths, but the URLs can still be indexed if they are linked from elsewhere, and non-compliant bots ignore the file entirely. Sensitive content should be protected with authentication or removed entirely.

Where should robots.txt live?

It should be served from the root of the domain, for example https://example.com/robots.txt.
