Simple web scraper built on top of requests and BeautifulSoup

Project description

A basic web scraper I use in many projects.


webscraper

Module to ease web scraping and cached web requests

Supports

  • Cached web requests (Wrapper around requests)

  • Built-in parsing/scraping (Wrapper around BeautifulSoup)

Constructor parameters

  • url: Default URL, used if nothing else is specified

  • scheme: Default scheme for scraping

  • timeout: Request timeout

  • cache_directory: Where to save cache files

  • cache_time: How long a cached resource stays valid, in seconds (default: 7 minutes)

  • cache_use_advanced

  • auth_method: Authentication method (default: HTTPBasicAuth)

  • auth_username: Authentication username. If set, enables authentication

  • auth_password: Authentication password

  • handle_redirect: Allow redirects (default: True)

  • user_agent: User agent to use

  • default_user_agents_browser: Browser to set in user agent (from default_user_agents dict)

  • default_user_agents_os: Operating system to set in user agent (from default_user_agents dict)

  • user_agents_browser: Browser to set in user agent (Overrides default_user_agents_browser)

  • user_agents_os: Operating system to set in user agent (Overrides default_user_agents_os)

  • html2text: HTML2text settings

  • html_parser: Which HTML parser to use (default: html.parser, built in)
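
As a quick illustration, several of the parameters above can be combined in a single dict-style constructor call (the same style used in the Example below). This is only a sketch: the import path is assumed from the package/module names above, and the URL, credentials and user agent string are placeholders, not defaults.

from floscraper.webscraper import WebScraper  # assumed import path

web = WebScraper({
    'url': "https://example.com/",      # default URL, used if nothing else is specified
    'timeout': 10,
    'cache_directory': "cache",         # where to save cache files
    'cache_time': 7 * 60,               # cached resources stay valid for 7 minutes
    'handle_redirect': True,            # allow redirects
    'auth_username': "user",            # setting a username enables authentication
    'auth_password': "secret",
    'user_agent': "Mozilla/5.0 (X11; Linux x86_64)",  # placeholder user agent
})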

Example

# Setup WebScraper with caching
# (import path assumed from the package/module names above)
from floscraper.webscraper import WebScraper

web = WebScraper({
    'cache_directory': "cache",
    'cache_time': 5*60
})

# First call to github.com -> hits the internet
web.get("https://github.com/")

# Second call to github.com (within 5 minutes of the first) -> hits the cache
web.get("https://github.com/")

Which results in the following output:

2016-01-07 19:22:00 DEBUG   [WebScraper._getCached] From inet https://github.com
2016-01-07 19:22:00 INFO    [requests.packages.urllib3.connectionpool] Starting new HTTPS connection (1): github.com
2016-01-07 19:22:01 DEBUG   [requests.packages.urllib3.connectionpool] "GET / HTTP/1.1" 200 None
2016-01-07 19:22:01 DEBUG   [WebScraper._getCached] From cache https://github.com
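
The messages above are emitted through Python's standard logging module (note the requests.packages.urllib3 logger). To see them in your own script, enable DEBUG logging; a minimal sketch follows, though the exact formatter used to produce the output above may differ:

import logging

# Surface DEBUG messages such as the cache hit/miss lines shown above
logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)-7s [%(name)s] %(message)s",
)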

History

0.2.3 (2018-07-02)

  • Upgrade flotils

0.2.2 (2017-12-07)

  • Fix cache duration bug

0.2.1 (2017-11-03)

  • Add raw response to uncached response

0.2.0 (2017-10-12)

  • Rework API names

  • Redesign caching

0.1.15a0 (2016-03-08)

  • First release on PyPI.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
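
If you just want to use the package, the current release can also be installed directly from PyPI with pip (pip install floscraper) instead of downloading the files below by hand.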

Source Distribution

floscraper-0.2.3.tar.gz (11.9 kB) - Source

Built Distributions

floscraper-0.2.3-py2.py3-none-any.whl (12.4 kB) - Python 2, Python 3

floscraper-0.2.3-py2.7.egg (26.6 kB) - Source

File details

Details for the file floscraper-0.2.3.tar.gz.

File metadata

  • Download URL: floscraper-0.2.3.tar.gz
  • Upload date:
  • Size: 11.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for floscraper-0.2.3.tar.gz:

  • SHA256: 254b55f5a0ab5792883c928ccefb397bf9f3e04aaa9d44d9db6c64ccb0f8d9b8
  • MD5: b2d2359a683ab8241f97f70879044027
  • BLAKE2b-256: 9cd25b2445fc8969837ed8e77b1ed3b8b1ca30ec43031a04e18f3631f5ea59af

See more details on using hashes here.
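
If you download the sdist manually, you can verify it against the SHA256 digest listed above using only the standard library; a minimal sketch (the file path is assumed to point at the downloaded archive):

import hashlib

# SHA256 digest for floscraper-0.2.3.tar.gz, as listed above
EXPECTED = "254b55f5a0ab5792883c928ccefb397bf9f3e04aaa9d44d9db6c64ccb0f8d9b8"

with open("floscraper-0.2.3.tar.gz", "rb") as f:  # path to the downloaded file
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED else "Hash mismatch!")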

File details

Details for the file floscraper-0.2.3-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for floscraper-0.2.3-py2.py3-none-any.whl:

  • SHA256: ef4e1c81e6a95ce440c68c6e2fff7bb6ccf7a7a9477388ad743ef4ce4a095c8d
  • MD5: d01603eda70e159c4056605101cb0ebf
  • BLAKE2b-256: 0b87eab074448df8891c860d958babdcac18c1fa1ca44ed69b22595b96807dff

See more details on using hashes here.

File details

Details for the file floscraper-0.2.3-py2.7.egg.

File metadata

File hashes

Hashes for floscraper-0.2.3-py2.7.egg:

  • SHA256: fd3fefe1bf2fb061a7e7e237a7c42446ae418570692b2b204bdb65993fc83a31
  • MD5: 74c2476fb39fb91fabc199f896bf66ab
  • BLAKE2b-256: 663900fc12074bff5bee72873eb0a934da3fbd8e591fcd931372b397f8cfbebb

See more details on using hashes here.
