Analyze your sitemap.xml and robots.txt. Check crawl rules, count indexed URLs, and find configuration issues.
Sitemap FAQ
What is a sitemap.xml file?
A sitemap.xml is a file that lists all the important URLs on your website. It helps search engines discover, crawl, and index your pages more efficiently.
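A minimal sitemap looks like this (the domain and date are placeholders for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want search engines to find -->
    <loc>https://example.com/</loc>
    <!-- Optional: date the page was last modified -->
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Only `<loc>` is required for each URL; fields like `<lastmod>` are optional hints for crawlers.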
What is robots.txt?
robots.txt is a file that tells search engine crawlers which URLs they may or may not crawl. It helps manage crawl budget and keep crawlers away from pages you don't want fetched. Note that robots.txt controls crawling, not indexing: a blocked URL can still appear in search results if other sites link to it.
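A simple robots.txt might look like this (the `/admin/` path is just an example):

```text
# Rules apply to all crawlers
User-agent: *
# Block crawling of the admin area (example path)
Disallow: /admin/
# Everything else may be crawled
Allow: /
```

The file must live at the root of your domain (e.g. `https://example.com/robots.txt`) to be picked up by crawlers.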
Do I need both a sitemap and robots.txt?
Neither is strictly required, but having both is considered best practice. Your robots.txt should also reference your sitemap URL so crawlers can find it.
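The reference is a single `Sitemap:` directive in robots.txt, for example (with a placeholder domain):

```text
Sitemap: https://example.com/sitemap.xml
```

Unlike `Disallow` rules, the `Sitemap:` line is not tied to a `User-agent` group and can appear anywhere in the file.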
How many URLs can a sitemap have?
A single sitemap can list up to 50,000 URLs and must be no larger than 50MB uncompressed. For larger sites, use a sitemap index file that references multiple sitemaps.
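A sitemap index is itself an XML file that points at your individual sitemaps, for example (filenames are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points to one child sitemap file -->
  <sitemap>
    <loc>https://example.com/sitemap-products.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemap-blog.xml</loc>
  </sitemap>
</sitemapindex>
```

You then submit only the index file; crawlers follow it to each child sitemap, and each child is bound by the same 50,000-URL / 50MB limits.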
Want the full picture?
Run a comprehensive AI readiness audit including SEO, content helpfulness, schema, trust signals, and 50+ more factors.