What does robots.txt do?
A robots.txt file gives crawling instructions to bots at the domain root, including which paths are allowed or disallowed and where the sitemap is located.
Build a simple robots.txt file for crawlers with allow, disallow, crawl-delay, and sitemap directives in one place.
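Assembling those directives is simple string construction; a minimal sketch of how such a generator might work (the `build_robots` function name and signature are illustrative, not the tool's actual code):

```python
def build_robots(allow=(), disallow=(), sitemap=None, crawl_delay=None, user_agent="*"):
    """Assemble a robots.txt body from directive lists."""
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Allow: {path}" for path in allow]
    lines += [f"Disallow: {path}" for path in disallow]
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

print(build_robots(allow=["/"],
                   disallow=["/admin/", "/private/"],
                   sitemap="https://example.com/sitemap.xml"))
```

Running this prints a file matching the example below.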
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
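You can check how a parser interprets rules like these with Python's standard-library `urllib.robotparser`. One caveat: Python's parser applies the first matching rule, so in this sketch the specific Disallow lines are listed before the blanket `Allow: /` (Google-style longest-match parsers would accept either order):

```python
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("MyBot", "https://example.com/index.html"))   # allowed
print(rp.can_fetch("MyBot", "https://example.com/admin/panel"))  # blocked
print(rp.crawl_delay("MyBot"))
```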
Does robots.txt protect private content?
No. It is a crawler directive, not an access-control mechanism: well-behaved bots honor it, but nothing enforces it. Sensitive content should be protected with authentication or removed entirely.
Where should the robots.txt file be located?
It must be served from the root of the domain, for example https://example.com/robots.txt; crawlers do not look for it in subdirectories.
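Given any page URL, the robots.txt location can be derived by keeping the scheme and host and replacing the path. A minimal sketch (the `robots_url` helper is illustrative):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url):
    """Derive the robots.txt URL for the site a page lives on."""
    scheme, netloc, _, _, _ = urlsplit(page_url)
    return urlunsplit((scheme, netloc, "/robots.txt", "", ""))

print(robots_url("https://example.com/blog/post?id=7"))
# https://example.com/robots.txt
```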