Test Your Robots.txt Before Deploying
Verify that your crawl rules work correctly so you don't accidentally block important pages from search engines
Before uploading your robots.txt file, it's crucial to test that your rules work as expected. A single mistake could block Google from accessing your entire site or important sections. This tool lets you paste your robots.txt content and test multiple URLs against it.
Why Test Your Robots.txt?
The robots.txt file uses a specific syntax with wildcards and patterns that can be tricky to get right. Testing ensures your Allow and Disallow rules work correctly for different user agents. This prevents common mistakes like accidentally blocking CSS/JS files, entire directories, or your sitemap.
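For example, in a hypothetical robots.txt, a one-character difference changes what gets blocked:

```
User-agent: *
Disallow: /admin      # also blocks /admin-blog/ and /administrator/
Disallow: /admin/     # blocks only paths under the /admin/ directory
Disallow: /assets/    # common mistake: hides the CSS/JS Google needs to render pages
```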
Features
Multi-URL Testing
Test multiple URLs at once against your robots.txt rules.
User-Agent Selection
Test rules for specific bots like Googlebot, Bingbot, or all bots (*).
Instant Results
Get immediate feedback on whether a URL is allowed or blocked.
Client-Side
All testing happens in your browser. Your robots.txt content is never sent anywhere.
Testing Tips
- Always test with the specific user-agent you're targeting (e.g., Googlebot, Bingbot); a batch-testing sketch follows this list.
- Test edge cases like URLs with query parameters (?page=2) and trailing slashes.
- Verify that CSS, JS, and image files are not accidentally blocked.
- Test your sitemap URL to make sure it's accessible.
- Check that admin, login, and private pages are properly blocked.
- Remember that robots.txt rules are case-sensitive for the path.
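This browser tool is the intended workflow, but for a quick offline sanity check, Python's standard-library urllib.robotparser can batch-test URLs in a similar way. Be aware that it implements the original first-match specification and does not support the * and $ wildcard extensions, so its results can differ from Google's longest-match behavior on overlapping rules. The rules, URLs, and user agent below are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt content; substitute your own rules.
# The Allow line is listed first so the first-match result agrees
# with Google's longest-match interpretation for this rule set.
ROBOTS_TXT = """\
User-agent: *
Allow: /admin/help.html
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
"""

# URLs worth testing: edge cases with query strings, trailing
# slashes, static assets, and the sitemap itself.
TEST_URLS = [
    "https://example.com/",
    "https://example.com/admin/",
    "https://example.com/admin/help.html",
    "https://example.com/blog/?page=2",
    "https://example.com/assets/site.css",
    "https://example.com/sitemap.xml",
]

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

for url in TEST_URLS:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(f"{verdict:>7}  {url}")
```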
Frequently Asked Questions
How does robots.txt matching work?
Robots.txt uses path-based matching. Disallow: /private/ blocks all URLs starting with /private/. The * wildcard matches any sequence of characters, and $ anchors a match to the end of the URL.
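A few hypothetical patterns show how these pieces combine:

```
User-agent: *
Disallow: /private/    # blocks /private/page.html, /private/sub/..., etc.
Disallow: /*.pdf$      # blocks any URL whose path ends in .pdf
Disallow: /search      # blocks /search, /search/, and /search?q=test
```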
Does Allow take priority over Disallow?
Google's implementation gives priority to the more specific rule. If both Allow and Disallow match, the longer (more specific) pattern wins. If they're the same length, Allow takes priority.
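A minimal sketch of that precedence logic, assuming a pre-parsed list of (directive, pattern) pairs for one user agent; this is illustrative, not the tool's actual implementation:

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Translate a robots.txt pattern into a regex: * matches any
    sequence of characters, a trailing $ anchors the end of the URL."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

def is_allowed(rules: list[tuple[str, str]], path: str) -> bool:
    """Among all matching rules, the longest (most specific) pattern
    wins; on a tie in length, Allow beats Disallow."""
    best_len, best_directive = -1, "allow"  # no matching rule => allowed
    for directive, pattern in rules:
        if rule_matches(pattern, path):
            if len(pattern) > best_len or \
               (len(pattern) == best_len and directive == "allow"):
                best_len, best_directive = len(pattern), directive
    return best_directive == "allow"

# The longer Allow pattern overrides the shorter Disallow:
rules = [("disallow", "/private/"), ("allow", "/private/help.html")]
print(is_allowed(rules, "/private/help.html"))    # True
print(is_allowed(rules, "/private/secret.html"))  # False
```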
Is robots.txt case-sensitive?
The User-agent field is case-insensitive, but path matching (Allow/Disallow) is case-sensitive. Disallow: /Private/ will not block /private/.