SEO · Crawl Control

Control who crawls your site.

Build a valid robots.txt file visually — set allow/disallow rules per crawler, block AI bots, add your sitemap, and download the file ready to deploy.

robots.txt

User-agent: *
Allow: /
Disallow: /admin
Disallow: /private

User-agent: GPTBot
Disallow: /

Sitemap: https://yoursite.com/sitemap.xml
Deploy tip: Upload robots.txt to your domain root — it must be accessible at https://yoursite.com/robots.txt. Verify with Google Search Console after deploying.
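Before uploading, you can sanity-check the rules locally with Python's standard-library robots.txt parser. This is a minimal sketch: the rules mirror the example above, and `https://yoursite.com` is a placeholder domain. Note that `urllib.robotparser` matches rules in file order rather than by longest path, so the sketch lists only the Disallow lines (everything else is allowed by default).

```python
# Sketch: checking robots.txt rules locally before deploying.
# Paths and the yoursite.com domain are placeholders from the example above.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin
Disallow: /private

User-agent: GPTBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A generic crawler may fetch public pages, but not /admin or /private.
print(parser.can_fetch("Googlebot", "https://yoursite.com/blog"))   # True
print(parser.can_fetch("Googlebot", "https://yoursite.com/admin"))  # False

# GPTBot is blocked from the entire site.
print(parser.can_fetch("GPTBot", "https://yoursite.com/"))          # False
```

This only confirms how a standards-following parser reads your file; it does not replace checking the live URL in Google Search Console after deployment.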