Extract data from APD news site

Project description


ScrAPD is a small utility designed to help organizations retrieve traffic fatality data in a friendly manner.

Installation

ScrAPD requires Python 3.7+ to work.

pip install scrapd

Quickstart

Collect all the data as CSV:

scrapd retrieve --format csv

By default, scrapd does not display anything until it is done collecting the data. If you want feedback about the process, you can enable logging by adding -v BEFORE the command you want to use. Repeating the -v option increases the verbosity, up to a maximum of 3 (-vvv):

scrapd -v retrieve --format csv

To save the results to a file, use shell redirection:

scrapd -v retrieve --format csv > results.csv
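
From there the file can be consumed programmatically. A minimal Python sketch, assuming the results.csv produced by the command above and the column names shown in the Examples section below:

import csv

with open('results.csv', newline='') as f:
    for row in csv.DictReader(f):
        # Each row is a dict keyed by the CSV header, e.g. row['Case'].
        print(row['Case'], row['Date'], row['Location'])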

Examples

Retrieve the traffic fatalities that occurred between January 15th, 2019 and January 18th, 2019, and output the results as JSON:

scrapd retrieve --from "Jan 15 2019" --to "Jan 18 2019" --format json

[
  {
    "Age": 31,
    "Case": "19-0150158",
    "DOB": "07/09/1987",
    "Date": "January 15, 2019",
    "Ethnicity": "White",
    "Fatal crashes this year": "1",
    "First Name": "Hilburn",
    "Gender": "male",
    "Last Name": "Sell",
    "Link": "http://austintexas.gov/news/traffic-fatality-1-4",
    "Location": "10500 block of N IH 35 SB",
    "Time": "6:20 a.m."
  },
  {
    "Age": 58,
    "Case": "19-0161105",
    "DOB": "02/15/1960",
    "Date": "January 16, 2019",
    "Ethnicity": "White",
    "Fatal crashes this year": "2",
    "First Name": "Ann",
    "Gender": "female",
    "Last Name": "Bottenfield-Seago",
    "Link": "http://austintexas.gov/news/traffic-fatality-2-3",
    "Location": "West William Cannon Drive and Ridge Oak Road",
    "Time": "3:42 p.m."
  }
]
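
Since the output is plain JSON, it is easy to post-process. A minimal Python sketch, assuming the output above was saved to a results.json file:

import json

with open('results.json') as f:
    fatalities = json.load(f)

# Compute the average age across the returned entries.
ages = [entry['Age'] for entry in fatalities]
print(f'{len(fatalities)} fatalities, average age {sum(ages) / len(ages):.1f}')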

Run the same query, but output the results as CSV:

scrapd retrieve --from "Jan 15 2019" --to "Jan 18 2019" --format csv


Fatal crashes this year,Case,Date,Time,Location,First Name,Last Name,Ethnicity,Gender,DOB,Age,Link
1,19-0150158,"January 15, 2019",6:20 a.m.,10500 block of N IH 35 SB,Hilburn,Sell,White,male,07/09/1987,31,http://austintexas.gov/news/traffic-fatality-1-4
2,19-0161105,"January 16, 2019",3:42 p.m.,West William Cannon Drive and Ridge Oak Road,Ann,Bottenfield-Seago,White,female,02/15/1960,58,http://austintexas.gov/news/traffic-fatality-2-3

Retrieve all the traffic fatalities from 2019 (as of January 20th, 2019) as JSON, enabling logging to follow the progress of the operation:

scrapd -v retrieve --from "1 1 2019" --format json

Fetching page 1...
Fetching page 2...
Total: 2
[
  {
    "Age": 31,
    "Case": "19-0150158",
    "DOB": "07/09/1987",
    "Date": "January 15, 2019",
    "Ethnicity": "White",
    "Fatal crashes this year": "1",
    "First Name": "Hilburn",
    "Gender": "male",
    "Last Name": "Sell",
    "Link": "http://austintexas.gov/news/traffic-fatality-1-4",
    "Location": "10500 block of N IH 35 SB",
    "Time": "6:20 a.m."
  },
  {
    "Age": 58,
    "Case": "19-0161105",
    "DOB": "02/15/1960",
    "Date": "January 16, 2019",
    "Ethnicity": "White",
    "Fatal crashes this year": "2",
    "First Name": "Ann",
    "Gender": "female",
    "Last Name": "Bottenfield-Seago",
    "Link": "http://austintexas.gov/news/traffic-fatality-2-3",
    "Location": "West William Cannon Drive and Ridge Oak Road",
    "Time": "3:42 p.m."
  }
]

Export the results to Google Sheets:

scrapd -v retrieve \
  --from "Feb 1 2019" \
  --format gsheets \
  --gcredentials creds.json \
  --gcontributors "remy.greinhofer@gmail.com:user:writer"

Speed and accuracy

ScrAPD executes all its HTTP requests asynchronously. As a result, the collection process is very fast.
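
The general technique looks like the following minimal sketch (a simplified illustration using aiohttp and hypothetical URLs, not ScrAPD's actual implementation):

import asyncio
import aiohttp

async def fetch(session, url):
    # Issue the GET request and return the response body as text.
    async with session.get(url) as response:
        return await response.text()

async def main():
    # Hypothetical news page URLs; ScrAPD discovers the real ones itself.
    urls = [f'http://example.com/news?page={i}' for i in range(1, 4)]
    async with aiohttp.ClientSession() as session:
        # Fire all the requests concurrently instead of one after the other.
        pages = await asyncio.gather(*(fetch(session, url) for url in urls))
    print(f'Fetched {len(pages)} pages')

asyncio.run(main())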

It parses the information using both the text of the report itself and the tweet text stored in the page metadata. Combining these two methods provides a high degree of confidence in the parsing and yields a success rate of about 90%.
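
For example, the tweet text lives in the page's <meta> tags; here is a minimal sketch of pulling it out with the standard library (the twitter:description name and this approach are illustrative, not necessarily what ScrAPD does internally):

from html.parser import HTMLParser

class TwitterMetaParser(HTMLParser):
    # Collect the content of every twitter:* <meta> tag.
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == 'meta' and attrs.get('name', '').startswith('twitter:'):
            self.meta[attrs['name']] = attrs.get('content', '')

parser = TwitterMetaParser()
parser.feed('<meta name="twitter:description" content="Case: 19-0150158 Date: January 15, 2019" />')
print(parser.meta['twitter:description'])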

Some statistics:

  • 125 entries in total

  • 112 entries correctly parsed (90%)

    • 105 entries fully parsed (85%)

    • 7 entries where the victims were unidentified or no information was available (5%)

  • 7 entries failed the parsing (bug or incorrect regex) (5%)

  • 6 entries used natural language instead of a field-like organization (5%)

    • e.g. “54 years of age” or “42 years old” instead of “DOB: 01/02/1972” (see the sketch after this list)

  • processing time: ~1m40s
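
A minimal sketch of handling both the field-like and the natural-language age variants (the patterns are illustrative assumptions, not ScrAPD's actual code):

import re

# One pattern per known way of expressing the victim's age.
AGE_PATTERNS = [
    ('dob', re.compile(r'DOB:\s*(\d{2}/\d{2}/\d{4})')),
    ('age', re.compile(r'(\d{1,3}) years (?:of age|old)')),
]

def extract_age(text):
    # Return the kind of match and the raw token, or None if nothing matched.
    for kind, pattern in AGE_PATTERNS:
        match = pattern.search(text)
        if match:
            return kind, match.group(1)
    return None

print(extract_age('DOB: 01/02/1972'))            # ('dob', '01/02/1972')
print(extract_age('54 years of age'))            # ('age', '54')
print(extract_age('The man was 42 years old.'))  # ('age', '42')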

Who uses ScrAPD?

The Austin Pedestrian Advisory Council used ScrAPD to compile a detailed presentation on the state of traffic deaths in Austin, TX.
