Free Tool

Robots.txt Generator

Generate a valid robots.txt file for your website.

Example output

User-agent: *
Allow: /
Disallow: /api/
Disallow: /dashboard/
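If you want to sanity-check rules like these before deploying, Python's standard-library urllib.robotparser can parse the file and answer allow/deny questions (the example.com URLs below are placeholders). One caveat: urllib.robotparser applies the first matching rule rather than the longest-match rule from RFC 9309, so in this sketch the specific Disallow lines are listed before the catch-all Allow.

```python
import urllib.robotparser

# Rules equivalent to the generated file above. The specific
# Disallow lines come before "Allow: /" because urllib.robotparser
# uses first-match order, not RFC 9309 longest-match.
ROBOTS_TXT = """\
User-agent: *
Disallow: /api/
Disallow: /dashboard/
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# example.com is a placeholder domain.
print(parser.can_fetch("*", "https://example.com/blog/post"))   # True
print(parser.can_fetch("*", "https://example.com/api/users"))   # False
print(parser.can_fetch("*", "https://example.com/dashboard/"))  # False
```

Crawlers that follow RFC 9309 (including Google) pick the most specific matching rule regardless of order, so the generated file above works as intended either way.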

What is robots.txt?

The robots.txt file tells search engine crawlers which parts of your site they may and may not crawl. It lives at the root of your domain (e.g. example.com/robots.txt). Without it, well-behaved crawlers assume everything is fair game, including admin pages, API routes, and staging content. Note that robots.txt controls crawling, not indexing: a blocked URL can still appear in search results if other sites link to it. To keep a page out of the index entirely, use a noindex meta tag or X-Robots-Tag header instead.

Always include a Sitemap: directive pointing to your XML sitemap. This helps search engines discover all your pages efficiently.
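The Sitemap line takes an absolute URL and is independent of any User-agent group, so it can go anywhere in the file. A minimal sketch (example.com and the sitemap path are placeholders), checked with the same standard-library parser, whose site_maps() method is available from Python 3.8:

```python
import urllib.robotparser

# Hypothetical file: the sitemap URL must be absolute, and the
# Sitemap line sits outside any User-agent group.
ROBOTS_TXT = """\
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())
print(parser.site_maps())  # ['https://example.com/sitemap.xml']
```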

Want a full audit?

SEOLint checks robots.txt, sitemap, and 40+ other issues automatically.

Scan my website