
The TopDownHockey Scraper


TopDownHockey EliteProspects Scraper

By Patrick Bacon, made possible by the work of Marcus Sjölin and Harry Shomer.


This is a package built for scraping two data sources:

  1. The NHL's Play-by-Play Reports, which come in the form of HTML/API reports from the NHL and XML reports from ESPN.

  2. Elite Prospects, an extremely valuable website which makes hockey data for thousands of leagues available to the public.

This package is strictly built for end users who wish to scrape data for personal use. If you are interested in using Elite Prospects data for professional purposes, I recommend you look into the Elite Prospects API.

While using the scraper, please be mindful of EliteProspects servers.

Installation


You can install the package by entering the following command in your terminal:

pip install TopDownHockey_Scraper

If you're interested in using the NHL Play-By-Play scraper, import that module in Python with:

import TopDownHockey_Scraper.TopDownHockey_NHL_Scraper as tdhnhlscrape

If you're interested in using the Elite Prospects scraper, import that module in Python with:

import TopDownHockey_Scraper.TopDownHockey_EliteProspects_Scraper as tdhepscrape

User-End Functions (NHL Scraper)


scrape_schedule(start_date, end_date)

Returns the NHL's schedule from the API for all games played between a start date and an end date.

  • start_date: The first date in the list of game dates that you would like to scrape. Enter as a string in "YYYY-MM-DD" format.
  • end_date: The last date in the list of game dates that you would like to scrape. Enter as a string in "YYYY-MM-DD" format.

Example:

tdhnhlscrape.scrape_schedule("2021-01-01", "2021-05-20")
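The schedule comes back as a dataframe; a minimal sketch of taking a quick look at it, assuming (as the combined example further down does) that it includes ID and type columns:

schedule = tdhnhlscrape.scrape_schedule("2021-01-01", "2021-05-20")
print(schedule.head())         # preview the first few scheduled games
print(schedule.type.unique())  # game types, e.g. 'R' for regular-season games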


full_scrape(game_id_list, shift = True)

Returns a dataframe containing play-by-play data for a list of game ids.

  • game_id_list: A list of NHL game ids.
  • shift: Whether to shift the coordinate source to ESPN. By default, the scraper will attempt to pull location coordinates from the NHL's API first.

Example:

tdhnhlscrape.full_scrape([2020020014, 2020020015, 2020020016])
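If you want something other than the default coordinate behavior described above, the shift argument can be passed explicitly; this is only a sketch of the call syntax:

tdhnhlscrape.full_scrape([2020020014, 2020020015, 2020020016], shift = False)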

Combine the two functions and scrape the entire 2021 regular season:

schedule_2021 = tdhnhlscrape.scrape_schedule("2021-01-01", "2021-05-20")
schedule_2021 = schedule_2021[schedule_2021.type == 'R']
game_list_2021 = list(schedule_2021.ID)
pbp_2021 = tdhnhlscrape.full_scrape(game_list_2021)
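Because the result is a regular dataframe, the usual pandas I/O methods apply if you want to save a scrape for later; the filename here is just an example:

pbp_2021.to_csv("pbp_2021.csv", index = False)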

User-End Functions (Elite Prospects Scraper)


get_skaters(leagues, seasons)

Returns a dataframe containing statistics for all skaters in a target set of league(s) and season(s).

  • leagues: One or multiple leagues. If one league, enter it as a string, e.g. "nhl". If multiple leagues, enter them as a tuple or list, e.g. ("nhl", "ahl").
  • seasons: One or multiple seasons. If one season, enter it as a string, e.g. "2018-2019". If multiple seasons, enter them as a tuple or list, e.g. ("2018-2019", "2019-2020").

Example:

tdhepscrape.get_skaters(("nhl", "ahl"), ("2018-2019", "2019-2020"))
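As noted in the parameter descriptions above, a single league and a single season can also be passed as plain strings:

tdhepscrape.get_skaters("nhl", "2018-2019")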


get_goalies(leagues, seasons)

Returns a dataframe containing statistics for all goalies in a target set of league(s) and season(s).

  • leagues: One or multiple leagues. If one league, enter it as a string, e.g. "nhl". If multiple leagues, enter them as a tuple or list, e.g. ("nhl", "ahl").
  • seasons: One or multiple seasons. If one season, enter it as a string, e.g. "2018-2019". If multiple seasons, enter them as a tuple or list, e.g. ("2018-2019", "2019-2020").

Example:

tdhepscrape.get_goalies("khl", "2015-2016")


get_player_information(dataframe)

Returns a dataframe containing bio information for all skaters or goalies (or both) within a target dataframe.

  • dataframe: The dataframe returned by one of the previous two commands.

Example:

Say you obtain skater data for the KHL in 2020-2021 and store it as a dataframe called output. You can then run this function to get bio information for every player in that scrape.

output = tdhepscrape.get_skaters("khl", "2020-2021")

tdhepscrape.get_player_information(output)
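The same works for a goalie dataframe, since the function accepts the output of either scraping function:

goalie_output = tdhepscrape.get_goalies("khl", "2015-2016")

tdhepscrape.get_player_information(goalie_output)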


add_player_information(dataframe)

Returns a dataframe containing bio information for all skaters or goalies (or both) within a target dataframe as well as the statistics from the original dataframe.

  • dataframe: The dataframe returned by one of the previous two commands.

Example:

Say you obtain skater data for the KHL in 2020-2021 and store it as a dataframe called output. You can then run this function to get bio information for every player in that scrape, alongside their original statistics.

output = tdhepscrape.get_skaters("khl", "2020-2021")

tdhepscrape.add_player_information(output)
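Putting the Elite Prospects functions together, a minimal end-to-end sketch might look like this (the league, season, and filename choices are arbitrary examples):

skaters = tdhepscrape.get_skaters(("nhl", "ahl"), "2020-2021")
skaters_with_bio = tdhepscrape.add_player_information(skaters)
skaters_with_bio.to_csv("skaters_with_bio.csv", index = False)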

Comments, Questions, and Concerns


My goal was to make this package as error-proof as possible. I believe I've accounted for every issue that could potentially throw off a scrape, but it's possible I've missed something.

If any issues arise, or you have any questions about the package, please do not hesitate to contact me on Twitter at @TopDownHockey or email me directly at patrick.s.bacon@gmail.com.

