Robots.txt Generator

Create and customize robots.txt files to control search engine crawling


About Robots.txt

A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests. It is not a reliable way to keep a page out of search results: a disallowed page can still be indexed if other sites link to it, so use a noindex directive for that instead.
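You can check how crawlers will interpret a robots.txt file with Python's standard-library `urllib.robotparser`. A minimal sketch (the example.com URLs and paths are placeholders):

```python
from urllib import robotparser

# A sample robots.txt: block every crawler from /admin/
robots_txt = """\
User-agent: *
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Paths under /admin/ are disallowed for any user agent...
print(rp.can_fetch("*", "https://example.com/admin/settings"))  # False
# ...while everything else remains crawlable.
print(rp.can_fetch("*", "https://example.com/blog/post"))       # True
```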

Common Directives
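An annotated reference of the directives the generator supports (the `#` comment syntax is part of the robots.txt format; the paths and sitemap URL are placeholders):

```text
# User-agent: which crawler the following rules apply to ("*" = all)
User-agent: *

# Disallow: path prefix the crawler should not request
Disallow: /admin/

# Allow: exception to a broader Disallow rule
Allow: /admin/public/

# Crawl-delay: seconds to wait between requests
Crawl-delay: 10

# Sitemap: absolute URL of your XML sitemap
Sitemap: https://example.com/sitemap.xml
```

Note that Crawl-delay is not part of the official standard and is ignored by Google, though some other crawlers respect it.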

Frequently Asked Questions

What should I block in robots.txt?
Common paths to block include:
  • Admin areas and login pages
  • Search result pages
  • Private or temporary files
  • Duplicate content pages
  • Development or staging environments
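As a sketch, a robots.txt covering those categories might look like the following; the exact paths depend on how your site is organized, and these are placeholders:

```text
User-agent: *
Disallow: /admin/
Disallow: /login/
Disallow: /search/
Disallow: /tmp/
Disallow: /staging/
```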
Where should I place the robots.txt file?
The robots.txt file must be placed in the root directory of your website (e.g., https://example.com/robots.txt). It won't work if placed in a subdirectory.
Is robots.txt case sensitive?
Partially. Directive names are case-insensitive, so "disallow" works the same as "Disallow", though the conventional capitalization is "User-agent", "Disallow", "Allow", and "Sitemap". The path values, however, are case-sensitive: "Disallow: /admin/" does not block "/Admin/". The filename itself must also be lowercase: robots.txt, not Robots.txt.
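The case sensitivity of paths is easy to verify with Python's standard-library `urllib.robotparser` (example.com is a placeholder):

```python
from urllib import robotparser

# Disallow the lowercase /admin/ path only
rp = robotparser.RobotFileParser()
rp.parse("User-agent: *\nDisallow: /admin/".splitlines())

# The rule matches only the exact case it was written in:
print(rp.can_fetch("*", "https://example.com/admin/"))  # False (blocked)
print(rp.can_fetch("*", "https://example.com/Admin/"))  # True  (not matched)
```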