
github-stats-pages

Retrieve statistics for a user's repositories and populate the information onto a GitHub static page


Overview

Installation

Use our PyPI package to get the most stable release:

(venv) $ pip install github-stats-pages

Or if you want the latest version then:

(venv) $ git clone https://github.com/astrochun/github-stats-pages
(venv) $ cd github-stats-pages
(venv) $ python setup.py install

Execution

There are four primary scripts accompanying github-stats-pages:

  1. get_repo_list
  2. gts_run_all_repos
  3. merge-csv.sh
  4. make_stats_plots

get_repo_list generates a CSV file containing a list of public repositories for a GitHub user/organization. This database allows the code to aggregate statistics for all repositories. To run, simply use the following command:

(venv) laptop:github_data $ get_repo_list -u <username/organization>

This will generate a CSV file called "<username/organization>.csv". It is recommended to create a folder (e.g., github_data) as the contents will ultimately contain multiple files.
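Under the hood, a repository list like this can be assembled from GitHub's REST API (GET /users/{username}/repos), which returns one JSON object per repository including a "fork" flag. The sketch below is illustrative only, not the actual get_repo_list implementation; it shows how such a response could be filtered and written to CSV (the network fetch is replaced by stand-in data):

```python
import csv
import io

def write_repo_csv(repos, out):
    """Write non-fork repositories to CSV; return the number of rows written.

    `repos` mimics items from GitHub's GET /users/{username}/repos
    endpoint (each dict has at least "name" and "fork" keys).
    Illustrative sketch only, not the package's implementation.
    """
    writer = csv.writer(out)
    writer.writerow(["name", "fork"])
    count = 0
    for repo in repos:
        if not repo.get("fork", False):  # skip forks, as the real tool does
            writer.writerow([repo["name"], repo["fork"]])
            count += 1
    return count

# Stand-in API data (a real run would first fetch
# https://api.github.com/users/<username>/repos):
sample = [
    {"name": "github-stats-pages", "fork": False},
    {"name": "some-forked-repo", "fork": True},
]
buf = io.StringIO()
n = write_repo_csv(sample, buf)
```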

Next, let's gather the statistics for all public repositories that are not forks. For this we use another Python library, github-traffic-stats, which provides a Python script called gts.

Accessing traffic data requires a Personal Access Token (PAT), so let's create one. Generate it by going to the following GitHub page. For selected scopes, you will only need repo.
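For context, traffic data comes from GitHub's documented traffic endpoints (e.g., GET /repos/{owner}/{repo}/traffic/clones), authenticated with the PAT. The gts tool handles this for you; the sketch below only shows how such a request is formed (the token value is a placeholder, and no request is actually sent):

```python
def traffic_request(owner, repo, token):
    """Build the URL and headers for GitHub's clone-traffic endpoint.

    Illustrative only; gts performs the real requests. The endpoint is
    GitHub's documented GET /repos/{owner}/{repo}/traffic/clones.
    """
    url = f"https://api.github.com/repos/{owner}/{repo}/traffic/clones"
    headers = {
        "Authorization": f"token {token}",        # PAT with the `repo` scope
        "Accept": "application/vnd.github+json",  # GitHub's JSON media type
    }
    return url, headers

# Placeholder token, same as in the example above:
url, headers = traffic_request("astrochun", "github-stats-pages", "abcdef12345678")
```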

Then you can execute the next script:

(venv) laptop:github_data $ API_TOKEN='abcdef12345678'
(venv) laptop:github_data $ gts_run_all_repos -u <username/organization> -t $API_TOKEN -c <username/organization>.csv

This will generate CSV files prefixed with date and time stamps for clones, traffic, and referrals. By running this code routinely, you will accumulate additional CSV files that let you extend beyond GitHub's two-week window of data retention. The data can be merged with the merge-csv.sh script:

(venv) laptop:github_data $ ./merge-csv.sh

This generates three files: merged_clone.csv, merged_traffic.csv, and merged_referrer.csv. These files are used in the final step to generate the plots.
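Conceptually, the merge concatenates the timestamped snapshots while dropping rows that appear in more than one overlapping two-week window. The following is a hedged Python sketch of that idea (merge-csv.sh is the supported tool; the column layout here is a stand-in):

```python
import csv
import io

def merge_stats(csv_texts):
    """Concatenate several stats CSVs sharing a header, dropping exact
    duplicate rows. Illustrative sketch of what merging achieves."""
    header = None
    seen = set()
    rows = []
    for text in csv_texts:
        reader = csv.reader(io.StringIO(text))
        this_header = next(reader)
        if header is None:
            header = this_header
        for row in reader:
            key = tuple(row)
            if key not in seen:  # overlapping windows repeat rows
                seen.add(key)
                rows.append(row)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(header)
    writer.writerows(rows)
    return out.getvalue()

# Two overlapping snapshots of clone stats (hypothetical columns):
a = "repo,date,clones\ngithub-stats-pages,2021-01-16,3\ngithub-stats-pages,2021-01-17,5\n"
b = "repo,date,clones\ngithub-stats-pages,2021-01-17,5\ngithub-stats-pages,2021-01-18,2\n"
merged = merge_stats([a, b])
```

The duplicated 2021-01-17 row survives only once in the merged output.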

Finally, to generate the static pages containing the visualizations, we use the make_stats_plots script:

(venv) laptop:github_data $ make_stats_plots -u <username> -c <username>.csv -t $API_TOKEN

This will generate all content in the current path. You can specify an output directory with the -o/--out-dir option; the default is the current path.

The resulting folder structure will, for example, look like the following:

github_data/
├── data
│   ├── 2021-01-17-00h-46m-clone-stats.csv
│   ├── 2021-01-17-00h-46m-referrer-stats.csv
│   ├── 2021-01-17-00h-46m-traffic-stats.csv
│   ├── ...
│   ├── merged_clone.csv
│   ├── merged_referrer.csv
│   └── merged_traffic.csv
├── repos
│   ├── github-stats-pages.html
│   └── ...
├── styles
│   ├── css
│   │   └── style.css
│   └── js
│       ├── bootstrap.min.js
│       ├── jquery.min.js
│       ├── main.js
│       └── popper.js
├── about.html
├── index.html
├── repositories.html
└── <username>.csv

Versioning

Continuous Integration

Authors

See also the list of contributors who participated in this project.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

github-stats-pages-0.3.0.tar.gz (99.0 kB)

Uploaded: Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

github_stats_pages-0.3.0-py3-none-any.whl (106.6 kB)

Uploaded: Python 3

File details

Details for the file github-stats-pages-0.3.0.tar.gz.

File metadata

  • Download URL: github-stats-pages-0.3.0.tar.gz
  • Upload date:
  • Size: 99.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.9.2

File hashes

Hashes for github-stats-pages-0.3.0.tar.gz
  • SHA256: 52d6504dc531491f2a13b50a832d92b706f4ad81baa3b9d1cc28b0eee7d04e38
  • MD5: c3bacc26e6d7266e0b94e7132ed5288f
  • BLAKE2b-256: 86419aaeb987231fdb211fafe0cc1a86a880d0c23ac7877990130cbcb8949dab

See more details on using hashes here.
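To verify a downloaded distribution against these published digests, Python's standard hashlib suffices. A minimal sketch (shown with a short byte string; in practice you would hash the downloaded file's bytes and compare to the SHA256 value above):

```python
import hashlib

def sha256_hex(data):
    """Return the SHA-256 hex digest of `data`, in the same format
    as the published hashes above."""
    return hashlib.sha256(data).hexdigest()

# In practice: sha256_hex(open("github-stats-pages-0.3.0.tar.gz", "rb").read())
digest = sha256_hex(b"hello")
# → "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824"
```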

File details

Details for the file github_stats_pages-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: github_stats_pages-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 106.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.7.3 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.9.2

File hashes

Hashes for github_stats_pages-0.3.0-py3-none-any.whl
  • SHA256: 0a3dfe30ce6bc8143eb2d13b232cc090c87d3672c613c127b4c0ed59cf047a27
  • MD5: 877d3acd94fa855cafe14a1f8eda79d8
  • BLAKE2b-256: 7995207a6cacdb3193ac6c1a22ee169d5553eb0d8c38547922bd133c9519ee8f

See more details on using hashes here.
