Script to parse MOM's website for report and stats updates

Project description

mom_scrape

example usage:

from mom_scrape.scrapers import ReportScraper, WebScraper

reportscraper = ReportScraper(filter_by='Reports', save_dir='stats/reports')
results = reportscraper.get_info()

This first creates a dates folder, then dumps the dates of the various reports into the mom_reports.json file. New PDFs (i.e. PDFs not already recorded in the mom_reports.json file) are then downloaded under save_dir.
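The record-then-download pattern described above can be sketched as follows. This is a minimal illustration, not the package's actual implementation: the function name find_new_reports and the scraped dict shape (report name mapped to PDF URL) are assumptions for the example.

```python
import json
from pathlib import Path


def find_new_reports(scraped, record_path):
    """Return scraped entries not yet in the JSON record, and update the record.

    scraped: dict mapping report name -> PDF URL (hypothetical shape).
    record_path: path to a JSON record file such as mom_reports.json.
    """
    record_file = Path(record_path)
    recorded = json.loads(record_file.read_text()) if record_file.exists() else {}
    # Only reports not already recorded are treated as new downloads.
    new = {name: url for name, url in scraped.items() if name not in recorded}
    # Merge and rewrite the record so the next run skips these entries.
    recorded.update(new)
    record_file.parent.mkdir(parents=True, exist_ok=True)
    record_file.write_text(json.dumps(recorded, indent=2))
    return new
```

On a first run every scraped report is new; on later runs only entries absent from the JSON record are returned for download.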

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mom_scrape-0.0.5.tar.gz (4.3 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

mom_scrape-0.0.5-py3-none-any.whl (4.6 kB view details)

Uploaded Python 3

File details

Details for the file mom_scrape-0.0.5.tar.gz.

File metadata

  • Download URL: mom_scrape-0.0.5.tar.gz
  • Upload date:
  • Size: 4.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for mom_scrape-0.0.5.tar.gz

  • SHA256: 333022ca2962e1d0e614a386d2e6dcd324c8165d807a4cf9ab4cd3b1134d1409
  • MD5: f83c28b08797e725bd1b4d1bb9d31567
  • BLAKE2b-256: 7d5e89da9a141d996b0b334a9bc9dd7fe316670a620561e1fb05ab980c7f3fad

See more details on using hashes here.

File details

Details for the file mom_scrape-0.0.5-py3-none-any.whl.

File metadata

  • Download URL: mom_scrape-0.0.5-py3-none-any.whl
  • Upload date:
  • Size: 4.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for mom_scrape-0.0.5-py3-none-any.whl

  • SHA256: ec8a3cfea26660024140c7c287ea6cecfe2a78dd6383259496a72524a0fa4a0e
  • MD5: a1aa71af2b841abe350f7adf011c9878
  • BLAKE2b-256: 005487dd0d7c0c9f5727b27c7fb9c2dfdbaa1cb1e8579e0b6b064b941eb99f46
