Robots.txt generator

Build a clean robots.txt with a visual rule builder, live validation, and path testing. Copy or download instantly.

Step 1: Configure your robots.txt rules

Choose an industry template that matches your platform, then customize the crawl rules to control how search engines access your website.

Step 1a

Choose a preset template that matches your platform (WordPress, Shopify, etc.) to get started quickly with industry-standard rules.

Step 1b

Customize the rules by adding/removing paths, configuring user agents, and setting up sitemaps for your specific needs.

Industry presets

Quick-start templates for popular platforms and use cases.

WordPress

CMS · Active

Standard WordPress robots.txt configuration

Blocks: 8 · Allows: 2

Drupal

CMS · Active

Drupal CMS with standard protection rules

Blocks: 22 · Allows: 1

Joomla

CMS · Active

Joomla CMS with administrator protection

Blocks: 14 · Allows: 1

Ghost

CMS · Active

Ghost publishing platform configuration

Blocks: 4 · Allows: 3

Strapi

CMS · Active

Strapi headless CMS configuration

Blocks: 5 · Allows: 1

Contentful

CMS · Active

Contentful headless CMS setup

Blocks: 7 · Allows: 1

Visual rule builder

Configure crawl permissions, user agents, and sitemap locations.

Step 2: Review and download your robots.txt

Preview your generated robots.txt file, check for validation warnings, then download it to upload to your website's root directory.

Live preview

robots.txt preview (5 lines, 93 characters, 93 bytes):

User-agent: *
Disallow: /admin/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml

Best Practices:

  • Place robots.txt in your site's root directory
  • Test your robots.txt with Search Console tools
  • Use specific paths rather than wildcards when possible
  • Include your sitemap URL for better crawling
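As a quick sanity check before uploading, rules like the preview above can also be tested locally with Python's standard-library `urllib.robotparser` (a sketch; note this parser follows the original 1994 syntax and does not understand `*` wildcards inside paths):

```python
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
"""

# Parse the generated file directly from a string, line by line
rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Blocked: falls under Disallow: /admin/
print(rp.can_fetch("*", "https://example.com/admin/settings"))  # False
# Allowed: no rule matches this path
print(rp.can_fetch("*", "https://example.com/blog/post-1"))     # True
```

The same `can_fetch` call works with any user-agent string, so you can check how a specific bot's rules resolve before the file goes live.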

Step 3: Test your robots.txt implementation

After uploading robots.txt to your website's root directory (yoursite.com/robots.txt), test how search engines will interpret your rules for specific URLs.

Validate & test your rules

Test how search engines will interpret your robots.txt rules for specific URLs and user agents.

Testing Tips:
  • Test both allowed and blocked paths to verify rules
  • Remember that Allow rules can override Disallow rules
  • More specific rules take precedence over general ones
  • Test with different user agents to check behavior
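The precedence tips above follow RFC 9309: the longest matching rule wins, and Allow wins ties. A minimal sketch of that logic (the `is_allowed` helper is illustrative, exact-prefix matching only, no wildcards):

```python
def is_allowed(path, rules):
    """Decide if a path may be crawled.

    rules: list of (directive, path_prefix) tuples, where directive
    is "allow" or "disallow". Longest matching prefix wins; on a
    length tie, "allow" wins (RFC 9309 precedence).
    """
    best_directive, best_len = "allow", -1  # no match -> allowed by default
    for directive, rule_path in rules:
        if path.startswith(rule_path):
            longer = len(rule_path) > best_len
            tie_allow = len(rule_path) == best_len and directive == "allow"
            if longer or tie_allow:
                best_directive, best_len = directive, len(rule_path)
    return best_directive == "allow"

rules = [
    ("disallow", "/wp-admin/"),
    ("allow", "/wp-admin/admin-ajax.php"),
]
print(is_allowed("/wp-admin/options.php", rules))     # False
print(is_allowed("/wp-admin/admin-ajax.php", rules))  # True
print(is_allowed("/blog/", rules))                    # True
```

This is why the WordPress preset can disallow "/wp-admin/" yet still allow "/wp-admin/admin-ajax.php": the Allow rule is longer, so it takes precedence for that exact path.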

Frequently Asked Questions

Get answers to common questions about robots.txt directives, user-agent targeting, crawl-delay settings, and how to test and deploy your robots.txt file.

How do I build a robots.txt file with this generator?

Select your website platform (WordPress, Shopify, Next.js, etc.) to start with a preset, or build from scratch. Add Allow/Disallow rules by specifying paths you want to control. Set user-agent targets (like Googlebot), add crawl delays if needed, and include your sitemap URL. Click "Generate" to create your robots.txt file.

What is the difference between "Disallow" and "Allow"?

"Disallow" blocks search engines from crawling specific paths (e.g., "Disallow: /admin/"). "Allow" explicitly permits crawling and is useful for overriding broader Disallow rules. For example, you might disallow "/wp-admin/" but allow "/wp-admin/admin-ajax.php" for functionality.

How do I test my robots.txt file?

After generating, use the built-in URL tester to check if specific paths are blocked or allowed. Upload the file to your site root (yoursite.com/robots.txt), then test in Google Search Console's robots.txt Tester. Our generator includes testing instructions for your specific platform.

How do wildcards work in robots.txt?

Use "*" to match any sequence of characters and "$" to match the end of a URL. Examples: "Disallow: /*.pdf$" blocks all PDF files, "Disallow: /*?*" blocks URLs with parameters. The generator includes wildcard examples and validates your syntax.
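Since wildcard support varies between parsers, here is a rough sketch of how a crawler might translate these patterns into a regular expression (the `wildcard_match` helper is hypothetical, not part of any standard library):

```python
import re

def wildcard_match(pattern, path):
    """Match a robots.txt path pattern against a URL path.

    '*' matches any sequence of characters; a trailing '$' anchors
    the pattern to the end of the path. Everything else is literal.
    """
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"   # restore the end-of-URL anchor
    return re.match(regex, path) is not None

print(wildcard_match("/*.pdf$", "/files/report.pdf"))      # True
print(wildcard_match("/*.pdf$", "/files/report.pdf?v=2"))  # False ($ anchors the end)
print(wildcard_match("/*?*", "/search?q=robots"))          # True
print(wildcard_match("/admin", "/admin/settings"))         # True (prefix match, no $)
```

Note the last case: without "$", every rule is a prefix match, which is why "Disallow: /admin" also blocks "/admin/settings".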

Which user-agents should I target?

Use "*" for all search engines, or target specific bots like "Googlebot", "Bingbot", or "facebookexternalhit". Most sites only need "*" rules. Specific user-agents are useful when you want different rules for different crawlers (e.g., stricter rules for aggressive bots).
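For instance, a file that keeps a strict default but gives one crawler its own rules might look like this (paths are illustrative):

```
# Default: all crawlers stay out of the admin area
User-agent: *
Disallow: /admin/

# Googlebot gets its own, more permissive group
User-agent: Googlebot
Disallow:
```

A crawler obeys only the most specific group that matches its user-agent, so Googlebot here follows its own group and ignores the "*" rules entirely.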

Should I set a crawl-delay?

Set crawl-delay only if your server is struggling with bot traffic. Most sites don't need it. If you do, start with a small delay, since overly high delays can slow down indexing. The generator warns if your delay might hurt SEO.
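If you do add one, Crawl-delay is set per user-agent group; a conservative example (note that Googlebot ignores this directive, while crawlers such as Bingbot honor it):

```
User-agent: *
Crawl-delay: 5
```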

User reviews

Recent ratings and feedback from tool users

Need technical SEO as a system, not a checklist?

I can implement technical SEO foundations across your site or app with monitoring, schema, and performance improvements that stay stable as you ship new pages.

Most popular

Book a strategy call

A short call to map the workflow and next step

Book now

Send a project brief

Get a clear proposal within 24 hours

Send brief

Get the systems guide

Short guide on dashboards, automation, and execution planning

Related resources

Articles on crawling, indexation, and technical SEO systems