Crawl Rules.
Revealed.
Robots.txt Checker
Is this page blocked? One click fetches robots.txt, parses every rule, and tells you whether the current URL is allowed.
Full robots.txt analysis.
Parses All Directives
World's First Browser OS
Launch from Quick Launch
150+ Tools
Why Check
Robots.txt?
Invisible blocks kill rankings.
A single Disallow line can remove your entire site from Google. Developers add blocking rules during staging and forget to remove them. Agencies inherit old configs. Plugin migrations overwrite settings.
The result: pages you want indexed get blocked. Pages that should be blocked get crawled.
The fix: Check robots.txt on any URL. See exactly which rules apply. Know instantly whether Googlebot can reach the page you're looking at. Crawl access is one check. The other eight Technical SEO Tools cover schema, links, speed, and authority.
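The kind of check described above can be sketched with Python's standard-library `urllib.robotparser` (a minimal illustration of robots.txt rule matching, not the extension's actual implementation; the rules and URLs are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt containing one leftover staging rule.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# One forgotten Disallow line blocks every URL under /private/ for all crawlers,
# Googlebot included, while the rest of the site stays crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

Because `User-agent: *` applies to every crawler, the single `Disallow` line is enough to hide the whole `/private/` section, which is exactly the kind of silent block the checker surfaces.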
Part of the
Technical SEO Toolkit
Tools that work together for faster technical audits.
Validate schema markup on any page
Check sitemap existence and status instantly
Scan any page for broken links
Quick DA and PA lookup for any domain
Check archived versions of any URL
Detect analytics and tracking scripts instantly
Audit heading structure and semantic markup
Nine SEO tests in one panel, one click
One extension. Every technical SEO tool in the same browser.
