Script to parse MOM's website for report and stats updates

Project description

mom_scrape

example usage:

from mom_scrape.scrapers import ReportScraper, WebScraper

reportscraper = ReportScraper(filter_by='Reports', save_dir='stats/reports')
results = reportscraper.get_info()

This first creates a dates folder, then dumps the dates of the various reports into the mom_reports.json file. Next, any new PDFs (those not already recorded in mom_reports.json) are downloaded under save_dir.
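The de-duplication step described above can be sketched as follows. Note that find_new_reports and the example file names are hypothetical illustrations, not part of the mom_scrape API; the package tracks recorded reports internally via mom_reports.json.

```python
import json

def find_new_reports(recorded, scraped):
    """Return scraped report names that are not yet recorded.

    Hypothetical helper illustrating the skip-already-recorded logic;
    names and structure are assumptions, not the package's internals.
    """
    seen = set(recorded)
    return [name for name in scraped if name not in seen]

# Entries already present in mom_reports.json (illustrative names):
recorded = ["labour_market_2023.pdf", "employment_q4.pdf"]
# Entries found on the current scrape (illustrative names):
scraped = ["labour_market_2023.pdf", "labour_market_2024.pdf"]

new_reports = find_new_reports(recorded, scraped)
print(json.dumps(new_reports))  # only the unseen PDF remains
```

Only the reports in `new_reports` would then be downloaded under save_dir, so repeated runs do not re-fetch files that are already recorded.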

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mom_scrape-0.0.4.tar.gz (4.3 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

mom_scrape-0.0.4-py3-none-any.whl (4.6 kB view details)

Uploaded Python 3

File details

Details for the file mom_scrape-0.0.4.tar.gz.

File metadata

  • Download URL: mom_scrape-0.0.4.tar.gz
  • Upload date:
  • Size: 4.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for mom_scrape-0.0.4.tar.gz
Algorithm Hash digest
SHA256 9766757b8a7bf9eaa0c97f3c79f79b91eeb3cee867adb60b39031dd43c229f1f
MD5 0d40f9d8df3f6aac7939ed3bf39f5706
BLAKE2b-256 b2e49553ad954aa7c70d6d9cd460fae6874e27e72d58497abac1ea6363ca84ed

See more details on using hashes here.

File details

Details for the file mom_scrape-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: mom_scrape-0.0.4-py3-none-any.whl
  • Upload date:
  • Size: 4.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for mom_scrape-0.0.4-py3-none-any.whl
Algorithm Hash digest
SHA256 51efe3c78905c5060a72666b25c52b3b73f28c09ba478e48b82bd792a6d0ecc0
MD5 06408007700bc585159f4bcd11dcfa12
BLAKE2b-256 982c16439a6d5d03d09c3c655e5a81800c0da2706fb75ff0e4945dfaec3cc877

See more details on using hashes here.
