Robots.txt Inspector
Paste a site URL or robots.txt URL to inspect directives, groups, and common issues.
Robots.txt Inspector: Rule Validation and Risk Checks
Robots.txt Inspector validates directive logic, not just syntax, before deployment. It helps catch crawl-control mistakes that hide important pages from crawlers or expose low-value endpoints.
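As a rough sketch of this kind of path testing, Python's standard-library `urllib.robotparser` can evaluate a rule set against concrete URLs. The rules and URLs below are made-up examples, and note one caveat: the stdlib parser applies rules in file order, whereas RFC 9309 and Google's crawler use longest-prefix matching, so this is an illustration rather than a reference implementation.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule set: Allow is listed first because the stdlib
# parser returns the first matching rule, not the longest match.
rules = """\
User-agent: *
Allow: /api/public/
Disallow: /api/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Test critical paths explicitly against the active rules.
print(rp.can_fetch("*", "https://example.com/api/public/page"))  # True
print(rp.can_fetch("*", "https://example.com/api/private"))      # False
print(rp.can_fetch("*", "https://example.com/blog/post"))        # True (no rule matches, default allow)
```

Because matching semantics differ between parsers, the safest robots.txt is one whose verdicts are identical under both first-match and longest-match resolution.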
- Test critical paths explicitly (homepage, article, category, API, media) against active rules.
- Look for conflicting Allow/Disallow directives with equal prefix length; RFC 9309 breaks the tie in favor of Allow, but crawlers vary, so rewrite ambiguous pairs explicitly.
- Confirm sitemap declarations in robots.txt match live sitemap URLs and status codes.
- Re-run checks after CDN, proxy, or framework routing changes to prevent silent regressions.
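For the sitemap check above, the declarations first have to be extracted from the file. A minimal sketch, using a hypothetical `sitemap_urls` helper and made-up URLs (per RFC 9309, `Sitemap:` lines are case-insensitive and independent of any user-agent group):

```python
import re

def sitemap_urls(robots_txt: str) -> list[str]:
    """Collect every Sitemap declaration from a robots.txt body."""
    urls = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        m = re.match(r"(?i)^sitemap:\s*(\S+)", line)
        if m:
            urls.append(m.group(1))
    return urls

robots = """\
User-agent: *
Disallow: /tmp/
Sitemap: https://example.com/sitemap.xml
sitemap: https://example.com/news-sitemap.xml
"""
print(sitemap_urls(robots))
```

Each extracted URL should then be requested (for example with `urllib.request`) to confirm it returns a 200 status and still points at the live sitemap location.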