One of the very first steps I take when onboarding a new client for AI visibility is a simple—but crucial—test: can AI bots even crawl your site?
If your site is blocking AI tools (even unintentionally), then all your optimization efforts won’t matter.
Let me show you what I mean.
👇 In the screenshots below, I tested a client’s site using Scrunch’s crawlability checker.


The first screen shows blocked access to multiple AI bots like GPTBot (OpenAI), ClaudeBot (Anthropic), and CCBot (Common Crawl).
The second screen shows the site after fixes were made—and now it’s open to all major AI crawlers.
These bots power tools like ChatGPT, Perplexity, and Meta AI. If they can’t crawl your content, your brand won’t exist in their responses.
Why this matters:
AI tools don’t index the entire internet. They rely on data scraped and structured by these bots. Blocking them = invisibility.
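To make that concrete, here's what this looks like in a robots.txt file. The user-agent tokens below are the real ones these crawlers announce, but the rules themselves are an illustrative sketch, not any client's actual file:

```text
# Blocks the major AI crawlers from the entire site —
# sometimes added intentionally, often inherited by accident:
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
Disallow: /

# Explicitly allowing them instead:
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: CCBot
Allow: /
```

Note that some CDNs and firewalls block these bots at the server level too, so a clean robots.txt alone doesn't guarantee access.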
What I do:
- Run a full crawlability audit with tools like Scrunch or Robots.txt testers
- Check for blocked AI bots in robots.txt and server headers
- Fix or rewrite rules to allow safe access while maintaining privacy and control
- Re-test to confirm visibility
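If you want a rough version of that first audit step yourself, Python's standard library can check specific bot user agents against a robots.txt. A minimal sketch — the sample rules and bot list here are illustrative, and a real audit should also check server headers and firewall rules:

```python
import urllib.robotparser

def check_ai_bots(robots_txt: str, bots: list[str]) -> dict[str, bool]:
    """Return {bot_name: allowed} for the root path, given robots.txt text."""
    rp = urllib.robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {bot: rp.can_fetch(bot, "/") for bot in bots}

# Hypothetical robots.txt that blocks OpenAI's crawler but allows everyone else.
SAMPLE = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

# Real user-agent tokens for the major AI crawlers.
AI_BOTS = ["GPTBot", "ClaudeBot", "CCBot", "PerplexityBot"]

for bot, allowed in check_ai_bots(SAMPLE, AI_BOTS).items():
    print(f"{bot}: {'allowed' if allowed else 'BLOCKED'}")
```

For a live site, you'd fetch `https://yoursite.com/robots.txt` and pass its text in; tools like Scrunch automate this plus the server-level checks.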
So, can AI tools see your site? Get a fast, free check here.