As of 2025-2026, AI companies actively crawl the web to train models and power AI search. Managing these crawlers via robots.txt is a critical technical SEO consideration.
| Crawler | Company | robots.txt token | Purpose |
|---|---|---|---|
| GPTBot | OpenAI | GPTBot | Model training |
| ChatGPT-User | OpenAI | ChatGPT-User | Real-time browsing |
| ClaudeBot | Anthropic | ClaudeBot | Model training |
| PerplexityBot | Perplexity | PerplexityBot | Search index + training |
| Bytespider | ByteDance | Bytespider | Model training |
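As an illustration, a site that wants to opt out of model training while still allowing real-time AI browsing might use per-agent rules like the following. This is a hypothetical example policy, not a recommendation; the tokens come from the table above, and the paths are placeholders.

```txt
# Block training crawlers site-wide (example policy)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: Bytespider
Disallow: /

# Allow real-time browsing, but keep it out of a private area
User-agent: ChatGPT-User
Disallow: /private/

# Everything else remains unrestricted
User-agent: *
Disallow:
```

Note that robots.txt is advisory: compliant crawlers honor it, but it is not an access control mechanism.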
This technical SEO audit spans 8 categories: crawlability, indexability, security, URL structure, mobile, Core Web Vitals, structured data, and JavaScript rendering. Use it when the user mentions "technical SEO", "crawl issues", "robots.txt", "Core Web Vitals", "site speed", or "security headers". Source: agricidaniel/claude-seo.
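For the crawlability checks, whether a given crawler token may fetch a given URL can be verified programmatically with Python's standard-library `urllib.robotparser`. A minimal sketch, using an inline example robots.txt (the rules and URLs are assumptions for illustration):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content (hypothetical policy for illustration)
ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: PerplexityBot
Disallow: /private/
"""

def crawler_allowed(robots_txt: str, agent: str, url: str) -> bool:
    """Return True if `agent` is permitted to fetch `url` under `robots_txt`."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)

# GPTBot is blocked site-wide; PerplexityBot only from /private/
print(crawler_allowed(ROBOTS_TXT, "GPTBot", "https://example.com/page"))          # False
print(crawler_allowed(ROBOTS_TXT, "PerplexityBot", "https://example.com/page"))   # True
```

In an audit, the same check would typically load the live file with `RobotFileParser.set_url(...)` and `read()` instead of an inline string.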