Is Your Website Ready for the AI Web?
Check how AI crawlers, search engines, and agents see your website. Get a free readiness score in seconds.
AI Bot Analyzer
See which AI bots can access your site. Check robots.txt rules for GPTBot, ClaudeBot, Google-Extended, and 13 more.
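For example, a robots.txt that opts out of AI training crawlers while staying open to everything else might look like this (GPTBot, ClaudeBot, and Google-Extended are the publicly documented user-agent tokens; adjust the rules to your own policy):

```text
# Block OpenAI's training crawler
User-agent: GPTBot
Disallow: /

# Block Anthropic's crawler
User-agent: ClaudeBot
Disallow: /

# Opt out of Google AI training while keeping Google Search
User-agent: Google-Extended
Disallow: /

# Everyone else
User-agent: *
Allow: /
```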
Sitemap & Robots Health
Validate your sitemap.xml and robots.txt for AI crawler compatibility. Find missing directives and broken rules.
Dark AI Crawler Detector
Discover rogue AI crawlers scraping your content without permission. Check exposure to undisclosed training bots.
MCP Endpoint Checker
Check if your site supports the Model Context Protocol for AI agent integration. Verify .well-known endpoints.
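MCP discovery conventions are still evolving, so the exact well-known path an agent probes is an assumption here. A minimal sketch of what such a check could look like, assuming a hypothetical `/.well-known/mcp.json` manifest:

```python
import json
import urllib.request
from urllib.parse import urljoin

# Assumed discovery path -- the MCP spec is still evolving,
# so treat this as illustrative, not a fixed standard.
MCP_WELL_KNOWN = "/.well-known/mcp.json"

def mcp_probe_url(site: str) -> str:
    """Build the URL an agent would fetch to look for MCP support."""
    return urljoin(site, MCP_WELL_KNOWN)

def fetch_mcp_manifest(site: str, timeout: float = 5.0):
    """Return the parsed manifest, or None if the endpoint is absent."""
    try:
        with urllib.request.urlopen(mcp_probe_url(site), timeout=timeout) as resp:
            return json.load(resp)
    except Exception:
        return None

print(mcp_probe_url("https://example.com/blog/"))
# -> https://example.com/.well-known/mcp.json
```

Note that `urljoin` resolves the absolute path against the site root, so the probe URL is the same no matter which page of the site you start from.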
AI Bot Exposure Score
Get an estimated risk score for how exposed your content is to AI training crawlers based on public signals.
Master State of AI
Explore our comprehensive library of guides, best practices, troubleshooting tips, and in-depth comparisons.
Explore Resources Hub
Frequently Asked Questions
What does the AI Readiness Score measure?
It analyzes your robots.txt rules, sitemap availability, llms.txt presence, MCP endpoints, and HTTPS security to score how well your website is prepared for AI crawlers and agents.
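The actual scoring formula is not published; the sketch below only illustrates how boolean signals like these could combine into a 0-100 score, with invented equal weights:

```python
# Illustrative weights -- the real State of AI formula is not public.
WEIGHTS = {
    "robots_txt_present": 20,
    "sitemap_present": 20,
    "llms_txt_present": 20,
    "mcp_endpoint": 20,
    "https": 20,
}

def readiness_score(signals: dict) -> int:
    """Sum the weights of every signal that checks out (0-100)."""
    return sum(w for key, w in WEIGHTS.items() if signals.get(key))

print(readiness_score({"robots_txt_present": True, "https": True}))  # 40
```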
Is my website data stored?
No. We only read publicly available files (robots.txt, sitemap.xml). No data is stored, and analysis results are discarded after display.
What is GPTBot?
GPTBot is OpenAI's web crawler that collects data to improve ChatGPT models. You can control its access via your robots.txt file.
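You can verify how a robots.txt treats GPTBot before deploying it, for example with Python's standard-library `urllib.robotparser` (the sample rules below are illustrative):

```python
from urllib.robotparser import RobotFileParser

# Sample robots.txt: blocks GPTBot site-wide, allows everyone else.
robots = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots.splitlines())

print(parser.can_fetch("GPTBot", "https://example.com/post"))        # False
print(parser.can_fetch("SomeOtherBot", "https://example.com/post"))  # True
```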
What is llms.txt?
llms.txt is a draft standard (similar to robots.txt) that provides structured guidance to AI language models about your website content, preferred summaries, and usage policies.
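The draft proposes a Markdown file served at /llms.txt, with a title, a short summary, and sections of annotated links. A minimal example following the draft's structure (the site details are invented):

```markdown
# Example Site

> A short plain-language summary of what this site offers.

## Docs

- [Getting started](https://example.com/docs/start): setup guide
- [API reference](https://example.com/docs/api): endpoint details
```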
Should I block AI bots?
It depends on your goals. Blocking training bots prevents your content from being used in AI models. Blocking search bots may reduce your visibility in AI-powered search results like ChatGPT Search and Perplexity.
What is the MCP standard?
The Model Context Protocol (MCP) is an emerging standard for AI agents to interact with websites. It defines how AI agents discover capabilities, call APIs, and handle authentication.
How often should I check my AI readiness?
Whenever you update your robots.txt or sitemap, or make major site changes. New AI crawlers and revised user-agent strings also appear regularly, so periodic re-checks are recommended.
Is this tool free?
Yes. All 6 tools on State of AI are completely free with no API costs. Checks are based on publicly available file analysis.
What makes this different from regular SEO tools?
Traditional SEO tools check Google/Bing crawlers. State of AI specifically checks AI-era signals: GPTBot, ClaudeBot, llms.txt, MCP endpoints, and AI training opt-out status.
Can I use this for my clients' websites?
Absolutely. The tool only reads publicly available files and does not require any authentication or access to the target website's admin panel.