Search Engines Directory & Bot Database

Browse the 22 active Search Engine bots in our database. Get detailed profiles, copy robots.txt rules, and check whether your URL is allowed or blocked by them.


Currently viewing 12 of 22 crawlers and bots

What do these safety ratings mean?

Filter by safety to see specific recommendations for each category.

Applebot
Safe
Search Engines

Applebot is the web crawler for Apple. It powers Siri and Spotlight suggestions, as well as Apple Intelligence features.

Applebot-Extended
Search Engines

Applebot-Extended is a secondary user-agent used by Apple. It allows web publishers to control how their content is used to train Apple's generative AI foundation models.

AspiegelBot
Search Engines

AspiegelBot is a web crawler associated with Huawei's Petal Search ecosystem, operating primarily from Ireland.

Baiduspider
Search Engines

Baiduspider is the official web crawler for Baidu, China's leading search engine. It is essential for having your website indexed and visible in Chinese search results.

Bingbot
Safe
Search Engines

Bingbot is the standard web crawler for Microsoft's Bing search engine. It allows your website to appear in Bing search results.

DuckDuckBot
Search Engines

DuckDuckBot is the web crawler for DuckDuckGo. While DuckDuckGo largely sources results from Bing, this bot is used to supplement and improve DuckDuckGo's own results.

Googlebot
Safe
Search Engines

Googlebot is Google's primary web crawling bot. It discovers and indexes new and updated pages to be added to the Google index.

Mojeek
Safe
Search Engines

Mojeek is an independent, privacy-focused search engine based in the UK. It builds its own index rather than relying on results from other search engines.

MojeekBot
Safe
Search Engines

MojeekBot is the user-agent string used by Mojeek's web crawler.

PetalBot
Safe
Search Engines

PetalBot is the web crawler for Petal Search, a search engine developed by Huawei. It is significant for reaching mobile users on Huawei devices.

SeznamHomepageCrawler
Search Engines

A specialized crawler for Seznam.cz, one of the most popular search engines in the Czech Republic, used for checking homepage updates.

Slurp
Safe
Search Engines

Yahoo! Slurp is the crawler for Yahoo Search. Although Yahoo is now powered by Bing, Slurp is still active for certain Yahoo services.

Check URL for Search Engines

Verify whether the crawlers currently in view are allowed or blocked for a specific URL.

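For reference, here is a minimal sketch of how such a check resolves, using a hypothetical /private/ section. Googlebot and most major crawlers apply the most specific (longest) matching path rule, so a narrow Allow can override a broader Disallow:

# Hypothetical rules for https://example.com
User-agent: Googlebot
Disallow: /private/                # blocks /private/ and everything under it
Allow: /private/press-kit.html     # longer match wins, so this one URL stays allowed

Under these rules, https://example.com/private/reports.html is blocked, while https://example.com/private/press-kit.html remains allowed.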

⚠️ Caution: Advanced Configuration

Your robots.txt file controls which compliant crawlers can access your website. Incorrect rules can accidentally de-index your entire site from search engines like Google. This tool generates syntactically valid rules based on your selection; it does not analyze your specific website's needs.

We strongly suggest testing any changes in Google Search Console or with CrawlerCheck before deploying to production.
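To illustrate the risk, the difference between an accidental site-wide block and a targeted rule can come down to a single path (the /staging/ directory below is a hypothetical example):

# DANGEROUS: removes the entire site from every compliant crawler
User-agent: *
Disallow: /

# Targeted: blocks only one section and leaves the rest crawlable
User-agent: *
Disallow: /staging/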

Generated robots.txt snippet for the currently viewed bots (22)

Select one of the options below to Disallow or Allow the bots.

Generating rules to BLOCK all bots currently in the list.

Review the list above. We recommend blocking bots marked as 'Unsafe' and carefully evaluating the bots marked as 'Caution'.

This is a live-generated robots.txt based on the filters you selected above.

User-agent: applebot
Disallow: /
User-agent: applebot-extended
Disallow: /
User-agent: aspiegelbot
Disallow: /
User-agent: baiduspider
Disallow: /
User-agent: bingbot
Disallow: /
User-agent: duckduckbot
Disallow: /
User-agent: googlebot
Disallow: /
User-agent: mojeek
Disallow: /
User-agent: mojeekbot
Disallow: /
User-agent: petalbot
Disallow: /
User-agent: seznamhomepagecrawler
Disallow: /
User-agent: slurp
Disallow: /
User-agent: teoma
Disallow: /
User-agent: yahoo-blogs
Disallow: /
User-agent: yahoo-feedseeker
Disallow: /
User-agent: yahoo-mmcrawler
Disallow: /
User-agent: yahooseeker
Disallow: /
User-agent: yandex
Disallow: /
User-agent: yandexadditional
Disallow: /
User-agent: yandexadditionalbot
Disallow: /
User-agent: yandexbot
Disallow: /
User-agent: baidu
Disallow: /

Copy and paste these rules into your website's robots.txt file to block the identified bots.
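If you select the Allow option instead, the generator produces permissive rules. Its exact output is not reproduced here, but an allow-all entry conventionally takes one of these two equivalent forms:

# Explicit allow-all for one bot
User-agent: applebot
Allow: /

# Equivalent: an empty Disallow value also permits full access
User-agent: bingbot
Disallow: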

Resource & Impact Analysis

Managing bot traffic is about more than just security. It's about optimizing your infrastructure and protecting your digital assets. Unchecked crawler activity can have significant downstream effects on your website's performance and business metrics.

📉 Server Load & Bandwidth

Every request from a bot consumes CPU cycles, RAM, and bandwidth. Aggressive scrapers can simulate a DDoS attack, slowing down your site for real human users and increasing your hosting costs, especially on metered cloud platforms.
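Short of blocking a bot outright, one possible mitigation is the non-standard Crawl-delay directive, which asks a crawler to pause between requests. Support varies by crawler and Googlebot ignores it entirely, so treat it as a hint rather than a guarantee:

# Ask compliant crawlers to wait 10 seconds between requests
# (non-standard; ignored by Googlebot, honored only by some crawlers)
User-agent: *
Crawl-delay: 10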

💰 Crawl Budget Waste

Search engines like Google assign a "Crawl Budget" to your site: a limit on how many pages they will crawl in a given timeframe. If low-value bots clog your server queues, Googlebot may reduce its crawl rate, delaying the indexing of your new content.
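A common way to protect crawl budget is to keep crawlers out of low-value, endlessly varying URLs such as internal search results or faceted filters. The paths below are hypothetical examples, and the * wildcard, while supported by major crawlers like Googlebot and Bingbot, is not honored by every bot:

# Keep crawlers focused on content pages rather than parameterized duplicates
User-agent: *
Disallow: /search          # internal site-search result pages
Disallow: /*?sort=         # sorted/faceted listing variants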

🤖 AI & Data Privacy

Modern AI bots (like GPTBot and CCBot) scrape your content to train Large Language Models. While not malicious, they use your intellectual property without sending traffic back. Blocking them allows you to opt out of having your data used for AI training.
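For example, a minimal opt-out for the two AI crawlers named above uses their documented user-agent tokens:

# Opt out of OpenAI's GPTBot
User-agent: GPTBot
Disallow: /

# Opt out of Common Crawl's CCBot, a frequent source of LLM training data
User-agent: CCBot
Disallow: /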

🕵️ Competitive Intelligence

Many "SEO Tools" and commercial scrapers are used by competitors to monitor your pricing, copy your content strategy, or analyze your site structure. Restricting these bots protects your business intelligence.

Understanding Web Crawlers & Bots

Web crawlers (also known as spiders or bots) are automated software programs that browse the internet. CrawlerCheck classifies them into distinct categories to help you decide which ones to allow and which to block.

Search Engine Bots

Bots like Googlebot and Bingbot are essential for your website's visibility. They index your content so it appears in search results. Blocking these will remove your site from search engines.

AI Data Scrapers

Bots like GPTBot (OpenAI), ClaudeBot (Anthropic), and PerplexityBot (Perplexity AI) crawl the web to collect data for training Large Language Models (LLMs). Blocking them prevents your content from being used to train AI, but does not affect your search rankings.

SEO Tools & Scrapers

Marketing tools like Ahrefs and Semrush scan your site to analyze backlinks and SEO health. While useful for SEO audits, aggressive scrapers can consume server bandwidth and impact performance.

Featured & Supported

We are proud to be featured on major platforms! Support CrawlerCheck by checking out our listings below and helping us spread the word.

CrawlerCheck - Instantly see if you're blocking search engines or AI bots | Product Hunt
CrawlerCheck - Featured on Startup Fame
Featured on toolfame.com