CrawlerDetect is a Python library designed to identify bots, crawlers, and spiders by analyzing their user agents.

Project description

About CrawlerDetect

This is a Python wrapper for CrawlerDetect, the PHP web crawler detection library. It identifies bots, crawlers, and spiders by analyzing the user agent and other HTTP headers. Currently, it can detect 3,678 bots, spiders, and crawlers.
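Detection of this kind boils down to matching the user agent against a large set of regular expressions. A purely illustrative, stdlib-only sketch of the idea (the pattern list below is hypothetical; the real library compiles thousands of patterns synced from the upstream PHP repo):

```python
import re

# Tiny, hypothetical pattern set -- NOT the library's actual rules.
BOT_PATTERNS = re.compile(
    r"(Sosospider|Googlebot|bingbot|crawler|spider)", re.IGNORECASE
)

def looks_like_crawler(user_agent: str) -> bool:
    """Return True if the user agent matches any known bot pattern."""
    return BOT_PATTERNS.search(user_agent) is not None

print(looks_like_crawler(
    "Mozilla/5.0 (compatible; Sosospider/2.0; +http://help.soso.com/webspider.htm)"
))  # True
```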

How to install

$ pip install crawlerdetect

How to use

Variant 1

from crawlerdetect import CrawlerDetect
crawler_detect = CrawlerDetect()
crawler_detect.isCrawler('Mozilla/5.0 (compatible; Sosospider/2.0; +http://help.soso.com/webspider.htm)')
# => True (crawler user agent detected)

Variant 2

from crawlerdetect import CrawlerDetect
crawler_detect = CrawlerDetect(user_agent='Mozilla/5.0 (iPhone; CPU iPhone OS 7_1 like Mac OS X) AppleWebKit (KHTML, like Gecko) Mobile (compatible; Yahoo Ad monitoring; https://help.yahoo.com/kb/yahoo-ad-monitoring-SLN24857.html)')
crawler_detect.isCrawler()
# => True (crawler user agent detected)

Variant 3

from crawlerdetect import CrawlerDetect
crawler_detect = CrawlerDetect(headers={
    'DOCUMENT_ROOT': '/home/test/public_html',
    'GATEWAY_INTERFACE': 'CGI/1.1',
    'HTTP_ACCEPT': '*/*',
    'HTTP_ACCEPT_ENCODING': 'gzip, deflate',
    'HTTP_CACHE_CONTROL': 'no-cache',
    'HTTP_CONNECTION': 'Keep-Alive',
    'HTTP_FROM': 'googlebot(at)googlebot.com',
    'HTTP_HOST': 'www.test.com',
    'HTTP_PRAGMA': 'no-cache',
    'HTTP_USER_AGENT': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/28.0.1500.71 Safari/537.36',
    'PATH': '/bin:/usr/bin',
    'QUERY_STRING': 'order=closingDate',
    'REDIRECT_STATUS': '200',
    'REMOTE_ADDR': '127.0.0.1',
    'REMOTE_PORT': '3360',
    'REQUEST_METHOD': 'GET',
    'REQUEST_URI': '/?test=testing',
    'SCRIPT_FILENAME': '/home/test/public_html/index.php',
    'SCRIPT_NAME': '/index.php',
    'SERVER_ADDR': '127.0.0.1',
    'SERVER_ADMIN': 'webmaster@test.com',
    'SERVER_NAME': 'www.test.com',
    'SERVER_PORT': '80',
    'SERVER_PROTOCOL': 'HTTP/1.1',
    'SERVER_SIGNATURE': '',
    'SERVER_SOFTWARE': 'Apache',
    'UNIQUE_ID': 'Vx6MENRxerBUSDEQgFLAAAAAS',
    'PHP_SELF': '/index.php',
    'REQUEST_TIME_FLOAT': 1461619728.0705,
    'REQUEST_TIME': 1461619728,
})
crawler_detect.isCrawler()
# => True (crawler user agent detected)
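Note that the headers dict above uses CGI/WSGI-style keys (`HTTP_USER_AGENT`, `HTTP_FROM`, and so on). If your framework hands you plain HTTP header names instead, a small helper (hypothetical, not part of the library) can convert them first:

```python
def to_cgi_headers(headers: dict) -> dict:
    """Convert plain HTTP header names (e.g. 'User-Agent') to
    CGI/WSGI-style keys (e.g. 'HTTP_USER_AGENT')."""
    return {
        "HTTP_" + name.upper().replace("-", "_"): value
        for name, value in headers.items()
    }

print(to_cgi_headers({"User-Agent": "Mozilla/5.0", "Accept-Encoding": "gzip"}))
# {'HTTP_USER_AGENT': 'Mozilla/5.0', 'HTTP_ACCEPT_ENCODING': 'gzip'}
```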

Output the name of the bot that matched (if any)

from crawlerdetect import CrawlerDetect
crawler_detect = CrawlerDetect()
crawler_detect.isCrawler('Mozilla/5.0 (compatible; Sosospider/2.0; +http://help.soso.com/webspider.htm)')
# => True (crawler user agent detected)
crawler_detect.getMatches()
# => 'Sosospider'
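Reporting which pattern matched works the same way in principle: the regex search retains the matched group. A stdlib-only sketch (again with hypothetical patterns, not the library's internals):

```python
import re

# Hypothetical pattern set for illustration only.
BOT_PATTERNS = re.compile(r"(Sosospider|Googlebot|bingbot)", re.IGNORECASE)

def matched_bot(user_agent: str):
    """Return the name of the matching bot, or None if nothing matched."""
    match = BOT_PATTERNS.search(user_agent)
    return match.group(1) if match else None

print(matched_bot(
    "Mozilla/5.0 (compatible; Sosospider/2.0; +http://help.soso.com/webspider.htm)"
))  # Sosospider
```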

Get version of the library

import crawlerdetect
crawlerdetect.__version__

Contributing

The patterns and test cases are synced from the PHP repo. If you find a bot/spider/crawler user agent that crawlerdetect fails to detect, please submit a pull request with the regex pattern and a test case to the upstream PHP repo.

Failing that, just create an issue with the user agent you have found, and we'll take it from there :)

Development

Setup

$ poetry install

Running tests

$ poetry run pytest

Update crawlers from upstream PHP repo

$ ./update_data.sh

Bump version

$ poetry run bump-my-version bump [patch|minor|major]

