Robots.txt Generator

Create a robots.txt file to control how search engine and AI crawlers access your site. Supports Googlebot, Bingbot, GPTBot, and more.

Supported crawlers:

  * (wildcard matching all crawlers)
  Googlebot (Google Search)
  Bingbot (Microsoft Bing)
  GPTBot (OpenAI's training crawler)
  CCBot (Common Crawl)
  ChatGPT-User (OpenAI, user-initiated browsing)
  Google-Extended (controls use of content in Google AI training)
  Slurp (Yahoo Search)
  DuckDuckBot (DuckDuckGo)

Generated robots.txt (the default output, which grants every crawler full access):

User-agent: *
Allow: /
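A generated file can also combine per-crawler rules with a sitemap reference. A hypothetical example (the domain and paths below are placeholders, not output of this tool):

```
# All crawlers: allow everything except the admin area
User-agent: *
Allow: /
Disallow: /admin/

# Block OpenAI's training crawler entirely
User-agent: GPTBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

Crawlers apply the most specific User-agent group that matches them, so GPTBot follows only its own group here and ignores the wildcard rules.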

How to Use Robots.txt Generator

  1. Select which crawlers you want to configure (Googlebot, Bingbot, GPTBot, etc.).
  2. For each crawler, choose whether to allow or disallow access.
  3. Add specific paths to allow or disallow if needed.
  4. Enter your sitemap URL to include it in the robots.txt.
  5. Copy or download the generated robots.txt file.
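Before deploying the file, you can sanity-check its rules with Python's standard-library robots.txt parser. A minimal sketch, assuming a generated file that allows everyone except GPTBot (the rules string and URLs are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical generator output: allow all crawlers, block GPTBot
rules = """\
User-agent: *
Allow: /

User-agent: GPTBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# GPTBot matches its own group, whose Disallow: / blocks everything
print(parser.can_fetch("GPTBot", "https://example.com/page"))     # False

# Googlebot has no dedicated group, so the wildcard rules apply
print(parser.can_fetch("Googlebot", "https://example.com/page"))  # True
```

This is a quick way to confirm that the allow/disallow choices you made in steps 2 and 3 behave as intended before uploading the file to your site root.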