Robots.txt Generator
Create a properly formatted robots.txt file to control how search engines crawl and index your website. Protect sensitive pages, manage bot access, and optimise your crawl budget effectively.
Configuration
User Agent
Use * to apply rules to all search engine crawlers, or name a specific crawler such as Googlebot or Bingbot
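For example, a single file can set a baseline for every crawler and then override it for one bot. Under the Robots Exclusion Protocol a crawler follows only the group that most specifically matches its user agent, so shared rules must be repeated in the specific group (paths here are illustrative):

User-agent: *
Disallow: /tmp/

User-agent: Googlebot
Disallow: /tmp/
Disallow: /search/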
Allow All Pages
When checked, search engines can crawl all pages (recommended for most websites)
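In robots.txt terms, allowing everything and blocking everything differ by a single directive. A minimal sketch of the two extremes, shown together for contrast rather than as one file:

# Allow every page
User-agent: *
Allow: /

# Block every page
User-agent: *
Disallow: /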
Disallow Specific Paths (optional)
Enter one path per line (e.g., /admin/, /private/)
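Each path becomes its own Disallow line in the output; entering the two example paths above would yield something like:

User-agent: *
Disallow: /admin/
Disallow: /private/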
Crawl Delay (seconds, optional)
Time to wait between requests (use only if your server needs it)
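A generator like this typically emits the value as a Crawl-delay directive. Note that Googlebot ignores Crawl-delay, while some crawlers such as Bingbot honour it. A 10-second delay would look like:

User-agent: *
Crawl-delay: 10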
Sitemap URL (optional)
Help search engines find your sitemap
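The sitemap is declared with a Sitemap directive using an absolute URL; it can appear anywhere in the file and applies regardless of user agent (the URL below is a placeholder):

Sitemap: https://www.example.com/sitemap.xml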
Common Use Cases
- Block admin panels: /admin/
- Block search results: /search/
- Block API endpoints: /api/
- Block temporary files: /tmp/
Combined, these rules form the single file sketched below.
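A minimal file applying all four rules to every crawler (paths as above; adjust them to your site's actual structure):

User-agent: *
Disallow: /admin/
Disallow: /search/
Disallow: /api/
Disallow: /tmp/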
Generated robots.txt
Upload to your website root
# robots.txt generated by SEO by Ishaan
# https://www.seobyishaan.com/tools/robots-generator

User-agent: *
Allow: /
How to Use
1. Download or copy the generated robots.txt file.
2. Upload it to the root directory of your website.
3. Verify it is accessible at yoursite.com/robots.txt (a quick programmatic check is sketched below this list).
4. Test it using Google Search Console.
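Beyond Search Console, one quick way to confirm the file is live and parseable is Python's standard-library robots.txt parser; the domain and paths below are placeholders:

from urllib.robotparser import RobotFileParser

# Fetch and parse the live file (replace the domain with your own)
rp = RobotFileParser()
rp.set_url("https://yoursite.com/robots.txt")
rp.read()

# Ask whether a given crawler may fetch a given URL under the parsed rules
print(rp.can_fetch("*", "https://yoursite.com/admin/"))    # False if /admin/ is disallowed
print(rp.can_fetch("Googlebot", "https://yoursite.com/"))  # True for an allow-all file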
Important Notice
The robots.txt file does NOT guarantee that pages won't be indexed. For sensitive content, use password protection or noindex meta tags instead.
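For context, a noindex directive is a single tag in a page's <head>, such as <meta name="robots" content="noindex">. Crawlers must be able to fetch a page to see that tag, so do not also disallow the same page in robots.txt, or the tag will never be read.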