Bot Blocking

Block specific AI bots with one click. Blocked bots receive a hard 403 Forbidden response before any WordPress output is sent.

Last updated Feb 21, 2026

Overview

The Bots tab gives you granular control over which AI bots can access your site. You can see every detected bot with its visit count and category, then decide on a per-bot basis whether to allow or block access.

Viewing detected bots

Navigate to CitedPro → Bots to see a list of all AI bots that have visited your site. Each entry shows:

  • Bot name: The identified bot (e.g., GPTBot, ClaudeBot)
  • Category: AI Search Crawler, AI Agent, AI Data Scraper, etc.
  • Hit count: Total visits recorded for this bot
  • Status: Whether the bot is currently allowed or blocked

Blocking individual bots

To block a specific bot:

  1. Find the bot in the Bots list
  2. Click the Block button next to it
  3. The bot is immediately blocked

When a bot is blocked, two things happen:

  • Hard block: The bot receives a 403 Forbidden response with an X-Blocked-By: CitedPro header. This runs at plugins_loaded priority 1, before any WordPress output or headers are sent.
  • robots.txt: A Disallow: / rule is added for the bot's user agent in your robots.txt file. This is a belt-and-suspenders approach: well-behaved bots respect robots.txt, while the hard block catches those that do not.

CitedPro caches blocked bot user-agent patterns in the cited_blocked_bot_patterns option for fast matching on every request, so blocking does not add database queries to your page loads.
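The hard-block behavior described above can be summarized with a small sketch. This is illustrative Python, not the plugin's actual PHP; the pattern list and function name are assumptions, but the logic mirrors what the docs describe: cached user-agent patterns are checked on each request, and a match produces a 403 with the X-Blocked-By: CitedPro header before anything else runs.

```python
# Illustrative sketch (Python, not the plugin's PHP) of the hard-block check.
# The pattern list stands in for the cached cited_blocked_bot_patterns option.

BLOCKED_PATTERNS = ["GPTBot", "ClaudeBot"]  # example blocked bots

def check_request(user_agent: str):
    """Return (status, headers) for a blocked bot, or None to let it through."""
    ua = user_agent.lower()
    for pattern in BLOCKED_PATTERNS:
        # Case-insensitive substring match against the request's User-Agent
        if pattern.lower() in ua:
            return 403, {"X-Blocked-By": "CitedPro"}
    return None

# A blocked bot's request is refused with the 403 and header...
print(check_request("Mozilla/5.0 (compatible; GPTBot/1.2)"))
# ...while an ordinary browser passes through untouched.
print(check_request("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0"))
```

Because the patterns come from a single cached option, the check is a simple in-memory loop rather than a database query on each page load.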

Bulk actions by category

Instead of blocking bots one at a time, you can block or unblock all bots in an entire category with a single click. For example, you might choose to block all AI Data Scrapers while keeping AI Search Crawlers allowed.

Available categories for bulk actions:

  • AI Search Crawler
  • AI Agent
  • AI Data Scraper
  • AI Assistant
  • Undocumented AI Agent

Unblocking bots

  1. Find the blocked bot in the list
  2. Click the Unblock button
  3. The bot is immediately removed from the block list
  4. The robots.txt Disallow rule is automatically removed

Bot activity details

Click any bot name to open a modal showing its recent visit details:

  • Pages the bot has visited
  • Visit timestamps
  • Full user-agent string
  • IP addresses

This helps you understand what each bot is doing on your site before deciding whether to block it.

robots.txt integration

When you block a bot, CitedPro automatically adds a corresponding Disallow: / directive in your robots.txt for that bot's user agent. This provides two layers of protection:

  1. robots.txt directive: Tells polite bots not to crawl (many AI bots respect this)
  2. Server-side block: Stops the request with a 403 response regardless of whether the bot respects robots.txt

Both layers are managed automatically. When you unblock a bot, the robots.txt rule is removed as well.
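The robots.txt layer follows the standard user-agent group format. This Python sketch (illustrative only; the plugin generates and removes these rules for you) shows the rule shape that blocking produces and how unblocking leaves nothing behind:

```python
# Illustrative sketch of the robots.txt layer: one User-agent/Disallow
# group per blocked bot, removed again when the bot is unblocked.

def robots_rules(blocked_bots):
    """Render a robots.txt group for each blocked bot's user agent."""
    groups = [f"User-agent: {bot}\nDisallow: /" for bot in blocked_bots]
    return "\n\n".join(groups)

print(robots_rules(["GPTBot"]))
# User-agent: GPTBot
# Disallow: /

print(robots_rules([]))  # after unblocking everything: no rules remain
```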

Strategy: which bots to block

Not all AI bots are equal. Here is a general framework for deciding which to block:

Consider allowing

  • AI Search Crawlers (GPTBot, ClaudeBot, PerplexityBot): These bots index your content so AI assistants can recommend your business. Blocking them means those platforms cannot cite you.
  • AI Agents from major platforms (Amazonbot, Applebot-Extended): These help your content appear in voice assistants and platform-specific AI features.

Consider blocking

  • AI Data Scrapers: These bots collect content for training data rather than for search or recommendations. Blocking them has no impact on your AI search visibility.
  • Undocumented AI Agents: Bots without clear documentation about their purpose. Block if you prefer a cautious approach.
  • Aggressive crawlers: Any bot making an unusually high number of requests (check the hit count and activity modal).

Important

Blocking AI search crawlers means those AI platforms cannot recommend your business in their responses. For example, blocking ClaudeBot means Claude cannot cite your content. Block search crawlers only if you have a specific reason to deny access.

Tip

A good starting point is to allow all AI Search Crawlers and AI Agents, block AI Data Scrapers, and review Undocumented AI Agents on a case-by-case basis.