Crawl Rules.

Revealed.

Robots.txt Checker

Is this page blocked? One click fetches robots.txt, parses every rule, and tells you whether the current URL is allowed.

Full robots.txt analysis.

Robots.txt Checker
Current page
/blog/seo-guide
ALLOWED
User-agent: *
Disallow: /admin/
Disallow: /wp-admin/
Allow: /
Sitemap: https://example.com/sitemap.xml

Parses All Directives

World's First Browser OS

Launch from Quick Launch

150+ Tools

Why Check
Robots.txt?

Invisible blocks kill rankings.

A single Disallow line can remove your entire site from Google. Developers add blocking rules during staging and forget to remove them. Agencies inherit old configs. Plugin migrations overwrite settings.

The result: pages you want indexed get blocked. Pages that should be blocked get crawled.

The fix: Check robots.txt on any URL. See exactly which rules apply. Know instantly whether Googlebot can reach the page you're looking at. Crawl access is one check. The other eight Technical SEO Tools cover schema, links, speed, and authority.
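The same check the extension performs can be sketched with Python's standard-library robots.txt parser. The rules and paths below mirror the example panel above and are illustrative only:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules matching the example panel above.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /wp-admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Can Googlebot reach these paths?
print(parser.can_fetch("Googlebot", "/blog/seo-guide"))   # True  (allowed)
print(parser.can_fetch("Googlebot", "/admin/settings"))   # False (blocked)
```

Note that rule order and longest-match behavior vary between parsers; Googlebot follows RFC 9309, which prefers the most specific matching rule.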

Part of the
Technical SEO Toolkit

Tools that work together for faster technical audits.

Structured Data Tester

Validate schema markup on any page

XML Sitemap Validator

Check sitemap existence and status instantly

Broken Link Checker

Scan any page for broken links

Domain Authority Checker

Quick DA and PA lookup for any domain

Wayback Machine Checker

Check archived versions of any URL

Website Tracker Detector

Detect analytics and tracking scripts instantly

Semantic HTML Analyzer

Audit heading structure and semantic markup

SEO Tests

Nine SEO tests in one panel, one click

One extension. Every technical SEO tool in the same browser.