is-bot
Python package to detect bots, crawlers, and spiders from the User-Agent string. It is a port of the isbot JavaScript module.
Requirements
- Python >= 3.7
- regex >= 2022.8.17
Installation
pip install is-bot
Usage
Simple usage
from is_bot import Bots
bots = Bots()
# A Googlebot user agent is detected as a bot
ua = 'Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; Googlebot/2.1; +http://www.google.com/bot.html) Chrome/104.0.5112.79 Safari/537.36'
assert bots.is_bot(ua)
# A regular Chrome user agent is not
ua = 'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/104.0.0.0 Safari/537.36'
assert not bots.is_bot(ua)
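In a web application the check fits naturally into a request hook. A minimal sketch, assuming Flask (which is not a dependency of is-bot) and a blanket block-all policy chosen purely for illustration:
from flask import Flask, abort, request
from is_bot import Bots

app = Flask(__name__)
bots = Bots()

@app.before_request
def block_bots():
    # Read the User-Agent header and refuse crawler traffic; adjust the policy as needed
    ua = request.headers.get('User-Agent', '')
    if bots.is_bot(ua):
        abort(403)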
Add/remove parsing rules
from is_bot import Bots
bots = Bots()
# Exclude Chrome-Lighthouse from the default bot list
ua = 'Mozilla/5.0 (Linux; Android 7.0; Moto G (4)) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4695.0 Mobile Safari/537.36 Chrome-Lighthouse'
assert bots.is_bot(ua)
bots.exclude(['chrome-lighthouse'])
assert not bots.is_bot(ua)
# Add a browser to the default bot list
ua = 'SomeAwesomeBrowser/10.0 (Linux; Android 7.0)'
assert not bots.is_bot(ua)
bots.extend(['SomeAwesomeBrowser'])
assert bots.is_bot(ua)
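The built-in rules are regular expressions (see the matches() output below), so extend() presumably accepts regex fragments as well. A sketch under that assumption, using a made-up monitoring agent:
from is_bot import Bots

bots = Bots()
# Assumption: entries passed to extend() are treated as regex patterns, like the default rules
bots.extend([r'uptime[- ]monitor'])
print(bots.is_bot('UptimeMonitor/2.1 (+https://example.com/uptime-monitor)'))
#> True, if extend() patterns are matched as regular expressions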
Get additional parsing information
from is_bot import Bots
bots = Bots()
ua = 'Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0 SearchRobot/1.0'
# Show the bot rule that the user-agent string matched
print(bots.find(ua))
#> Search
# list all patterns that match the user agent string
print(bots.matches(ua))
#> ['(?<! (ya|yandex))search', '(?<! cu)bot']
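These helpers are useful for recording which rule flagged a request. A small sketch using only the calls shown above plus Python's standard logging module (the helper name is made up):
import logging
from is_bot import Bots

logging.basicConfig(level=logging.INFO)
bots = Bots()

def log_bot_hit(ua):
    # Log the matching rule and all matching patterns for bot user agents
    if bots.is_bot(ua):
        logging.info('bot detected: rule=%r patterns=%r', bots.find(ua), bots.matches(ua))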