Free tool · No signup
Robots.txt Validator
Paste your site URL. We'll fetch /robots.txt, count directives, and flag syntax problems and the classic "oops, we blocked Google" mistake.
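Under the hood, a validator of this kind works roughly like the sketch below. This is a hypothetical illustration, not Flatline's actual code: it walks the file line by line, counts the common directives, and flags a blanket Disallow: / inside a User-agent: * group.

```python
def check_robots(lines):
    """Inspect robots.txt lines; return directive counts and a blanket-block flag."""
    counts = {"user-agent": 0, "disallow": 0, "allow": 0, "sitemap": 0}
    blocked_all = False
    group_agents = []        # user-agents of the group currently being read
    in_group_header = False  # True while consecutive User-agent lines are being collected
    for raw in lines:
        line = raw.split("#", 1)[0].strip()  # drop comments and surrounding whitespace
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field in counts:
            counts[field] += 1
        if field == "user-agent":
            if not in_group_header:
                group_agents = []  # a User-agent line after rules starts a new group
            group_agents.append(value)
            in_group_header = True
        else:
            in_group_header = False
            if field == "disallow" and value == "/" and "*" in group_agents:
                blocked_all = True  # the classic "oops, we blocked everyone" mistake
    return counts, blocked_all
```

A real validator would also check for unknown fields, misplaced rules, and BOM or encoding issues, but the core pass looks like this.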
Common questions
What does Disallow: / actually do?
Under User-agent: *, it tells every crawler not to crawl any URL on the site. We see this most often when a developer forgets to remove it after launching from a staging environment.
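This is the pattern to watch for, a typical staging leftover:

```
# Staging leftover: blocks every compliant crawler
User-agent: *
Disallow: /
```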
Should I have a robots.txt at all?
Yes. Even a minimal file (User-agent: * plus Sitemap: <your sitemap URL>) helps. A missing robots.txt is treated as 'allow everything', but you lose the chance to declare your sitemap.
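One common minimal form looks like this (the sitemap URL is a placeholder; the empty Disallow line explicitly allows everything):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```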
Where should the Sitemap directive go?
Anywhere in the file, though by convention it goes at the bottom. You can list several Sitemap lines, or a single line pointing at a sitemap index.
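For example, a site with separate sitemaps for pages and posts might declare both (placeholder URLs):

```
Sitemap: https://www.example.com/sitemap-pages.xml
Sitemap: https://www.example.com/sitemap-posts.xml
```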
Will robots.txt hide pages from public access?
No. It only asks polite crawlers not to crawl the listed URLs. Anyone with the URL can still load the page directly, and a blocked page can even appear in search results if other sites link to it. Use authentication for actual privacy, or a noindex meta tag to keep a page out of search results.
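To keep a page out of search results, the standard approach is a noindex meta tag in the page's <head> (or the equivalent X-Robots-Tag HTTP header):

```
<meta name="robots" content="noindex">
```

Note that crawlers must be able to fetch the page to see the tag, so don't also block that page in robots.txt.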
Get the full picture
Robots.txt is one of 30+ checks in the Flatline audit. Get a Visibility Score 0-100 with categorised fixes for performance, on-page, mobile, security and more.