Changelog, New Features and Updates

Complete version history and detailed changelog for all CrawlerCheck releases.

v1.0.0
v1.1.0
v1.1.1
v1.2.0
v1.3.0
v1.3.1
v1.3.2
v1.3.3
v1.4.0
v1.5.0
v1.5.1
v1.5.2
v1.5.3
v1.5.4
What's New in v1.5.4 -
  • Crawler Correction: Updated PanguBot's information to correctly identify its operator as Huawei and moved it to the AI Bots category.
  • Operator Updates: Enhanced the Huawei operator profile with details about PanGu Large Language Models and their associated crawlers.
What's New in v1.5.3 -
  • Sitemap Improvements: Added an automatically updated Operators Directory section and included version information in the page title.
  • Security & Stability: Addressed potential security vulnerabilities in dependencies and hardened the release workflow.
  • Safety System Update: Renamed the 'Unsafe' status to 'Aggressive' to more accurately reflect crawler behavior (pushy but not necessarily malicious).
  • Data Freshness: Added precise 'Added On' and 'Last Updated' dates to all crawler and operator profiles to ensure data transparency.
  • Directory Expansion: Added 25+ new crawlers and bots to the database, including crawlers for the latest AI models as well as new scrapers.
What's New in v1.5.2 -
  • Operators Directory: Introduced a new section to browse web crawlers by their operating entity (e.g., Google, OpenAI), making it easier to see who owns which bots.
  • Operator Profiles: Detailed profile pages for each operator showing all their bots, safety statistics, and direct links to official documentation.
What's New in v1.5.1 -
  • Enhanced Crawler Profiles: Added detailed "Blocking Impact" and "Common Use Cases" sections to every crawler page to help you make informed decisions.
  • Improved Navigation: Added a Technical Details sidebar for easier navigation on crawler detail pages.
What's New in v1.5.0 -
  • Crawler Directory: Launched a searchable database of 150+ web crawlers, AI bots, and SEO tools with real-time filtering.
  • Bulk Robots.txt Generator: Instantly generate "Allow" or "Block" rules for all currently filtered bots in one click, perfect for creating custom blocklists (see the example after this list).
  • Directory-Wide URL Checker: Check whether a specific URL is blocked by any of the currently viewed bots, all in a single pass.
  • Detailed Bot Profiles: Every bot now has a dedicated page featuring its operator, purpose, safety verdict, and specific user-agent strings.
  • Dynamic Filtering: Filter bots by Category (AI, Search, SEO) and Safety Rating (Safe, Aggressive, Malicious) to find exactly what you need.
  • Responsive & Accessible: A fully responsive layout with optimized mobile views and improved accessibility for screen readers.
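The bulk generator's output depends entirely on which bots you have filtered. Purely as an illustration of the kind of blanket "Block" rules it produces (the bot names below are examples, not a recommendation or the tool's actual output):

    # Illustrative robots.txt snippet; GPTBot and CCBot stand in for whichever bots you filtered
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /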
What's New in v1.4.0 -
  • The "Deep Inspection" Update: A major architectural upgrade bringing the engine in line with strict modern bot standards.
  • Source-Specific Blocking: The tool now distinguishes between blocks originating from HTTP Headers (HeaderNoIndex) versus HTML Meta Tags (MetaNoIndex) for faster debugging (see the examples after this list).
  • Crawl Budget Indicators: Added visual badges to instantly highlight categories where bots are allowed to visit but instructed not to index—a primary source of crawl budget waste.
  • RFC 9309 Strict Compliance: The decision engine has been rewritten to mimic the "Longest Match" and priority logic used by Googlebot and GPTBot (see the sketch after this list).
  • Expanded Crawler Database: Added support for 50+ new User-Agents, with a specific focus on the latest generation of AI Crawlers (Anthropic, OpenAI, Perplexity).
  • Deep Header Analysis: The engine now processes multi-layered response headers to catch edge-case directives often missed by standard SEO tools.
  • Advanced Security: Implemented a new custom network layer with hardened SSRF protection and improved stability.
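For reference, the two sources of a noindex directive that the tool now distinguishes take these standard forms (shown here for illustration):

    HTTP response header (reported as HeaderNoIndex):
        X-Robots-Tag: noindex

    HTML meta tag in the page head (reported as MetaNoIndex):
        <meta name="robots" content="noindex">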
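Under RFC 9309's "Longest Match" rule, when several Allow/Disallow patterns match a URL path, the rule with the longest pattern wins, and Allow wins ties. A minimal sketch of that selection logic in Python (an independent illustration, not CrawlerCheck's actual engine):

    import re

    def _pattern_to_regex(pattern: str) -> str:
        # Translate a robots.txt path pattern into a regex:
        # '*' matches any character sequence, a trailing '$' anchors the end of the URL.
        regex = ".*".join(re.escape(part) for part in pattern.split("*"))
        if regex.endswith(re.escape("$")):
            regex = regex[: -len(re.escape("$"))] + "$"
        return regex

    def is_allowed(path: str, rules: list[tuple[str, str]]) -> bool:
        # rules: ("allow" | "disallow", pattern) pairs from the matched user-agent group.
        # Longest matching pattern wins; on a tie, "allow" beats "disallow".
        best = None  # (pattern_length, is_allow)
        for kind, pattern in rules:
            if pattern and re.match(_pattern_to_regex(pattern), path):
                candidate = (len(pattern), kind == "allow")
                if best is None or candidate > best:
                    best = candidate
        return True if best is None else best[1]

    group = [("disallow", "/private"), ("allow", "/private/docs/"), ("disallow", "/*?SID=")]
    print(is_allowed("/private/docs/readme.html", group))  # True: "/private/docs/" is the longest match
    print(is_allowed("/index.php?SID=abc123", group))      # False: the query-string rule matches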
What's New in v1.3.3 -
  • Privacy Policy and Cookie Policy: Added privacy policy and cookie policy pages for GDPR and CCPA compliance.
  • Security: Hardened the website against common attacks.
  • Performance: General performance improvements across the site.
  • Code quality: General code quality improvements.
  • All previous features and security measures remain unchanged.
What's New in v1.3.2 -
  • Dark theme support: Added beautiful dark mode with automatic system preference detection and manual toggle option.
  • Performance optimizations: Enhanced caching strategy with improved service worker and Nginx configuration for faster page loads.
  • Better user experience: Improved preloading behavior for smoother navigation.
  • Offline support improvements: Enhanced service worker with better error handling and offline fallback pages.
  • Code quality improvements: Centralized version management system for more reliable deployments.
  • All previous features and security measures remain unchanged.
What's New in v1.3.1 -
  • Performance optimizations: Improved backend efficiency.
  • DNS caching: Added intelligent caching for faster SSRF protection checks (see the sketch after this list).
  • Frontend improvements: Optimized URL validation for better responsiveness.
  • All security features from v1.3.0 remain active and unchanged.
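SSRF protection generally means resolving a user-supplied hostname and refusing to fetch anything that points at private, loopback, or otherwise non-public address space; caching those DNS lookups avoids repeating the resolution on every check. A minimal sketch of the idea (illustrative only, with an assumed 5-minute TTL, not CrawlerCheck's actual code):

    import ipaddress
    import socket
    import time

    _dns_cache: dict[str, tuple[float, bool]] = {}  # hostname -> (expiry timestamp, is_safe)
    _TTL_SECONDS = 300  # assumed cache lifetime for the example

    def _resolves_to_public_ips(hostname: str) -> bool:
        # Reject any address in private, loopback, link-local, or reserved ranges,
        # the usual targets of an SSRF attempt.
        for info in socket.getaddrinfo(hostname, None):
            ip = ipaddress.ip_address(info[4][0].split("%")[0])  # strip an IPv6 zone id if present
            if ip.is_private or ip.is_loopback or ip.is_link_local or ip.is_reserved:
                return False
        return True

    def is_safe_host(hostname: str) -> bool:
        # Serve repeated checks from the cache until the entry expires.
        now = time.monotonic()
        cached = _dns_cache.get(hostname)
        if cached and cached[0] > now:
            return cached[1]
        safe = _resolves_to_public_ips(hostname)
        _dns_cache[hostname] = (now + _TTL_SECONDS, safe)
        return safe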
What's New in v1.3.0 -
  • Security and stability improvements: Enhanced backend protection against abuse and malicious requests.
  • Rate limiting: Added safeguards to prevent excessive or automated use, ensuring fair access for all users (a generic sketch of the technique follows this list).
  • All previous features and user experience remain unchanged for normal use.
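Rate limiting of this kind is usually a per-client quota over a time window. A minimal sliding-window sketch in Python (the window length and quota are invented for the example, not CrawlerCheck's actual limits):

    import time
    from collections import defaultdict

    _WINDOW_SECONDS = 60   # assumed window length
    _MAX_REQUESTS = 30     # assumed per-window quota

    _hits: defaultdict[str, list[float]] = defaultdict(list)

    def allow_request(client_ip: str) -> bool:
        # Drop timestamps that have fallen out of the window, then enforce the quota.
        now = time.monotonic()
        recent = [t for t in _hits[client_ip] if now - t < _WINDOW_SECONDS]
        if len(recent) >= _MAX_REQUESTS:
            _hits[client_ip] = recent
            return False
        recent.append(now)
        _hits[client_ip] = recent
        return True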
What's New in v1.2.0 -
  • User-agent categorization system: Results are now organized into 8 distinct categories (Search Engines, AI Bots, SEO Tools, Social Bots, etc.) for easier analysis.
  • Enhanced crawler detection: Added 20+ new user agents including the latest AI bots (MistralAI, Perplexity), SEO tools (SearchmetricsBot, Screaming Frog), and Yahoo crawlers.
  • URL sharing and bookmarking: Results can now be shared via direct links with query parameters.
  • Improved results navigation: Category sections now use SEO-friendly anchors for better organization.
  • Performance improvements: Faster response times and better handling of large robots.txt files.
  • Deeper crawler analysis: Better detection of crawl delays and user-agent specific rules.
  • Improved error handling: More detailed feedback when issues occur during analysis.
  • Updated API response structure: Results now include categorized data for better integration with other tools (a purely hypothetical illustration follows this list).
  • Enhanced caching system: More efficient result storage and retrieval.
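The actual response fields are not documented here; purely as a hypothetical illustration of what categorized results can look like (every field name below is invented for the example):

    # Hypothetical shape only; CrawlerCheck's real API response may differ.
    example_result = {
        "url": "https://example.com/",
        "categories": {
            "Search Engines": [{"user_agent": "Googlebot", "allowed": True}],
            "AI Bots": [{"user_agent": "GPTBot", "allowed": False}],
        },
    }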
What's New in v1.1.1 -
  • Backend HTTP client now mimics real browsers more closely for improved compatibility with strict sites (see the sketch after this list).
  • Automatic cookie handling and support for HTTP/2 enabled.
  • Referer header now transparently set to https://crawlercheck.com for all requests.
  • Improved diagnostics: HTTP status code and response body are logged for non-2xx responses to help troubleshoot blocks.
  • All previous features and compatibility with simpler sites are preserved.
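The behaviors above map naturally onto a modern HTTP client. A minimal sketch using Python's httpx library (an assumption about tooling; the changelog does not name the backend stack) with HTTP/2, automatic cookie handling, the Referer described above, and logging for non-2xx responses:

    import httpx

    # Browser-like header profile; values other than the Referer are illustrative.
    BROWSER_HEADERS = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
        "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
        "Accept-Language": "en-US,en;q=0.9",
        "Referer": "https://crawlercheck.com",
    }

    def fetch(url: str) -> httpx.Response:
        # http2=True negotiates HTTP/2 when the server supports it (requires httpx[http2]);
        # the Client keeps a cookie jar automatically across requests and redirects.
        with httpx.Client(http2=True, headers=BROWSER_HEADERS, follow_redirects=True, timeout=15.0) as client:
            response = client.get(url)
            if response.status_code // 100 != 2:
                # Surface the status and a slice of the body to help diagnose blocks.
                print(f"Fetch failed: {response.status_code} {response.text[:200]}")
            return response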
What's New in v1.1.0 -
  • Robots.txt parser now fully supports group precedence and ignores duplicate user-agent groups per standard.
  • Wildcard and end-anchor ($) rules are handled per the RFC and Googlebot behavior (see the example rules after this list).
  • Query string rules (e.g., Disallow: /*?SID=) now match URLs with query parameters.
  • Improved error handling and user feedback in both backend and frontend.
  • Accessibility and mobile responsiveness improvements.
  • Production logging and error reporting polish.
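As a concrete illustration of the pattern types above (example rules, not output generated by the tool):

    User-agent: *
    Disallow: /*.pdf$     # blocks /files/report.pdf but not /report.pdf?download=1
    Disallow: /*?SID=     # blocks /index.php?SID=abc123 and any other URL with a SID query parameter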
Going Live v1.0.0 -
  • Initial release of CrawlerCheck with crawler detection and robots.txt parsing functionality.
  • This release sets the core version of CrawlerCheck to v1.0.0.

If you would like to suggest a new feature or report a bug, please use our official X (Twitter) account: @crawlercheck

Featured & Supported

We are proud to be featured on major platforms! Support CrawlerCheck by checking out our listings below and helping us spread the word.

  • Featured on Product Hunt: "CrawlerCheck - Instantly see if you're blocking search engines or AI bots"
  • Featured on Startup Fame
  • Featured on toolfame.com