
Tiny, zero-dependency crawler detection via regex.

Project description

is-crawler

Tiny, zero-dependency Python library that detects bots and crawlers from user-agent strings. Fast, lightweight, and ready to drop into any web app or API.

Install

pip install is-crawler

Usage

from is_crawler import is_crawler

is_crawler("Googlebot/2.1 (+http://www.google.com/bot.html)")  # True
is_crawler("Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0.0.0 Safari/537.36")  # False

Works great as middleware, rate-limiter input, or analytics filter:

from flask import Flask, abort, request
from is_crawler import is_crawler

app = Flask(__name__)

@app.before_request
def block_bots():
    if is_crawler(request.headers.get("User-Agent", "")):
        abort(403)

How it works

Four fast regex checks, no database or external lookups:

  1. Bot signals -- common keywords (bot, crawl, spider, scrape, ...), URL/email patterns, and headless markers
  2. Missing browser signature -- real browsers always include engine tokens like WebKit, Gecko, or Trident
  3. Bare (compatible; ...) block -- classic bot pattern without OS tokens
  4. Known tools -- playwright, selenium, wget, lighthouse, sqlmap, and more
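
The four checks above can be sketched roughly as follows. This is a minimal illustration of the layered-regex approach, not the package's actual source -- every pattern below is an assumption based on the descriptions in the list:

```python
import re

# Illustrative patterns only -- not the package's actual regexes.
BOT_SIGNALS = re.compile(r"bot|crawl|spider|scrape|headless|https?://|\w+@\w+", re.I)
BROWSER_ENGINES = re.compile(r"webkit|gecko|trident", re.I)
BARE_COMPATIBLE = re.compile(r"\(compatible;[^)]*\)", re.I)
OS_TOKENS = re.compile(r"windows|macintosh|linux|android|iphone", re.I)
KNOWN_TOOLS = re.compile(r"playwright|selenium|wget|lighthouse|sqlmap", re.I)

def looks_like_crawler(user_agent: str) -> bool:
    ua = user_agent or ""
    # Check 1 & 4: explicit bot signals or known automation tools
    if BOT_SIGNALS.search(ua) or KNOWN_TOOLS.search(ua):
        return True
    # Check 2: no recognizable browser engine token
    if not BROWSER_ENGINES.search(ua):
        return True
    # Check 3: bare "(compatible; ...)" block without an OS token
    if BARE_COMPATIBLE.search(ua) and not OS_TOKENS.search(ua):
        return True
    return False
```

Because everything is plain `re` matching, each call is a handful of linear scans over the user-agent string with no I/O, which is what keeps the library fast and dependency-free.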

Need more?

If you need deeper user-agent analysis -- device type, OS, browser version, or full bot fingerprinting -- check out cr-ua.

License

Apache-2.0

Project details


Release history

This version

1.0.0

Download files

Download the file for your platform.

Source Distribution

is_crawler-1.0.0.tar.gz (8.2 kB)

Uploaded Source

Built Distribution


is_crawler-1.0.0-py3-none-any.whl (7.1 kB)

Uploaded Python 3

File details

Details for the file is_crawler-1.0.0.tar.gz.

File metadata

  • Download URL: is_crawler-1.0.0.tar.gz
  • Upload date:
  • Size: 8.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for is_crawler-1.0.0.tar.gz
Algorithm Hash digest
SHA256 cec88bd979321166268e5ec907ef0fc659cfc24e709e21254d550d9c6b9b2480
MD5 3fd318fa6e37724f595ad6cbc2d5049e
BLAKE2b-256 ecadfbdec6888527ff0803a694330658e26d2c01def77d417e2448a742d1e562


Provenance

The following attestation bundles were made for is_crawler-1.0.0.tar.gz:

Publisher: publish.yml on tn3w/is-crawler

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file is_crawler-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: is_crawler-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 7.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for is_crawler-1.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 e3937dacb672c3dbf1da7c7a25f9499c89547bae80deed406188d00335ab6109
MD5 4a3b0cefaa8c8930304189d80dc3728c
BLAKE2b-256 f4afe25ae4ddf936aa74e77adaf3f47777af41e067d9d4b49d7f87ed7ba21d89


Provenance

The following attestation bundles were made for is_crawler-1.0.0-py3-none-any.whl:

Publisher: publish.yml on tn3w/is-crawler

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
