
An agenda scraper framework for municipalities

Project description

Engage Scraper

Installation

pip install engage-scraper

About

The Engage Scraper is a standalone library that can be included in any service. Its purpose is to catalog a municipality's council meeting agendas in a usable format for clients such as the engage-client and engage-backend.

To extend this library for your municipality, override the methods of the base class from the scraper_core/ directory and put your implementation in scraper_logics/, prefixing it with your municipality's name. For an example, see the Santa Monica, CA scraper in the scraper_logics/ directory. The Santa Monica example uses htmlutils.py because its sources require HTML scraping. Feel free to make PRs with new utilities (for example, PDF scraping, RSS scraping, or JSON parsing). The Santa Monica example also uses SQLAlchemy for its models, which is the preferred approach in dbutils.py, but you can use anything; ORMs are preferred over vanilla psycopg2 and the like.
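As a minimal sketch of this extension pattern (the base-class name and method bodies below are illustrative stand-ins, not the actual scraper_core API; only the two exposed method names come from this page):

```python
from abc import ABC, abstractmethod


class AgendaScraperBase(ABC):
    """Illustrative stand-in for the scraper_core base class."""

    @abstractmethod
    def get_available_agendas(self):
        """Discover which agendas exist for this municipality."""

    @abstractmethod
    def scrape(self):
        """Process the discovered agendas and store their contents."""


class MyTownScraper(AgendaScraperBase):
    """A hypothetical municipality scraper living in scraper_logics/."""

    def get_available_agendas(self):
        # Real code would fetch a listing page, feed, or API here.
        return ["2019-01-08"]

    def scrape(self):
        # Real code would parse each agenda and persist it via dbutils.
        return {"2019-01-08": "parsed agenda"}
```

A real subclass would do network fetching and parsing in these methods; the sketch only shows which methods a municipality module is expected to override.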

To use the postgres dbutils.py, make sure to set these five environment variables (check dev.env and see the docker-compose usage below):

  • POSTGRES_HOST optional. A resolvable host or hostname; defaults to localhost
  • POSTGRES_USER required
  • POSTGRES_PASSWORD required
  • POSTGRES_PORT optional. Defaults to 5432
  • POSTGRES_DB required. The database used for cataloging your municipality's agendas
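As a sketch of how these variables could be consumed, assuming a SQLAlchemy-style connection URL (the helper below is an assumption for illustration, not the actual dbutils.py code):

```python
import os


def postgres_url():
    """Assemble a connection URL from the five POSTGRES_* variables above."""
    host = os.environ.get("POSTGRES_HOST", "localhost")  # optional
    port = os.environ.get("POSTGRES_PORT", "5432")       # optional
    user = os.environ["POSTGRES_USER"]                   # required
    password = os.environ["POSTGRES_PASSWORD"]           # required
    db = os.environ["POSTGRES_DB"]                       # required
    return f"postgresql://{user}:{password}@{host}:{port}/{db}"
```

Leaving the optional variables unset falls back to localhost:5432, matching the defaults listed above.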

An example of using the Santa Monica scraper library

from engage_scraper.scraper_logics import santamonica_scraper_logic

scraper = santamonica_scraper_logic.SantaMonicaScraper(committee="Santa Monica City Council")
scraper.get_available_agendas()
scraper.scrape()

For SantaMonicaScraper instantiation

For the Twitter utils used in SantaMonicaScraper

To use the Santa Monica logic, you must create an app on Twitter (we will work to make this optional). After creating the app, use the structure of the dev.env file to fill in the appropriate parameters, but make sure not to change the repository's copy: copy the file up one directory and edit it there. Then use docker-compose.yml for testing. You can add examples to examples/ and run them from the script in scripts/ using the Docker container.
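The workflow above might look like the following env-file fragment; the Twitter key names here are illustrative placeholders, so use the keys actually present in the repository's dev.env:

```shell
# Copy the repository's dev.env up one directory and edit the copy:
#   cp dev.env ../dev.env
# Placeholder names below -- match them to the repository's dev.env.
TWITTER_CONSUMER_KEY=your-consumer-key
TWITTER_CONSUMER_SECRET=your-consumer-secret
TWITTER_ACCESS_TOKEN=your-access-token
TWITTER_ACCESS_TOKEN_SECRET=your-access-token-secret
```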

For the SantaMonicaScraper class the init has these options:

  • tz_string="America/Los_Angeles" # default timezone string
  • years=["2019"] # default list of year strings
  • committee="Santa Monica City Council" # default council name string
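The constructor signature implied by these defaults can be sketched with a stand-in class (the real class lives in santamonica_scraper_logic; this stub only illustrates the keyword arguments and their defaults):

```python
class SantaMonicaScraperSketch:
    """Illustrative stand-in mirroring the SantaMonicaScraper defaults."""

    def __init__(self, tz_string="America/Los_Angeles",
                 years=None, committee="Santa Monica City Council"):
        self.tz_string = tz_string
        # Avoid a mutable default argument for the list of years.
        self.years = years if years is not None else ["2019"]
        self.committee = committee
```

With the real class you would pass the same keywords, for example `SantaMonicaScraper(years=["2018", "2019"])` to scrape two years.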

The exposed API methods for the scraper are:

  • .get_available_agendas() # To get available agendas, no arguments
  • .scrape() # To process agendas and store contents

Feel free to expose more

  • Write wrappers for internal functions if you want to expose them
  • Write extra functions to handle more complex municipality-specific tasks

Download files

Download the file for your platform.

Source Distribution

engage_scraper-0.0.28.tar.gz (12.4 kB)

Uploaded Source

Built Distribution

engage_scraper-0.0.28-py3-none-any.whl (15.0 kB)

Uploaded Python 3

File details

Details for the file engage_scraper-0.0.28.tar.gz.

File metadata

  • Download URL: engage_scraper-0.0.28.tar.gz
  • Upload date:
  • Size: 12.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.33.0 CPython/3.7.4

File hashes

Hashes for engage_scraper-0.0.28.tar.gz
  • SHA256: 15b9719114e3e36a59a5e66b4b2109d54a098fc82ae307cb8ce0c92fc341eb13
  • MD5: a0cc9f0a4f9c88e3a699115d44140042
  • BLAKE2b-256: 25ee2c6536c93c2d7176c4c4be368eeec67b8edebc1a2f44b1db260a95c46e82


File details

Details for the file engage_scraper-0.0.28-py3-none-any.whl.

File metadata

  • Download URL: engage_scraper-0.0.28-py3-none-any.whl
  • Upload date:
  • Size: 15.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.33.0 CPython/3.7.4

File hashes

Hashes for engage_scraper-0.0.28-py3-none-any.whl
  • SHA256: 3006e131b50445241e38dac924d28f71d1a6efaad53269222a21c47829e6fd6c
  • MD5: 4cf4aea283e6e3796b531c5398b535c0
  • BLAKE2b-256: 5a270059d9c1895436cab7b07ebd28cbc9999c9d4d8aba2c49624f5d671ee2ba

