github-stats-pages
Retrieve statistics for a user's repositories and populate the information onto a GitHub static page
Overview
Installation
Use our PyPI package to get the most stable release:
(venv) $ pip install github-stats-pages
Or if you want the latest version then:
(venv) $ git clone https://github.com/astrochun/github-stats-pages
(venv) $ cd github-stats-pages
(venv) $ python setup.py install
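As an aside (not part of the project's own instructions): `python setup.py install` is increasingly deprecated in favor of pip, and assuming a standard setuptools layout the source install can equivalently be done from inside the clone with:

```shell
(venv) $ pip install .
```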
Execution
There are four primary scripts accompanying github-stats-pages:
get_repo_list
gts_run_all_repos
merge-csv.sh
make_stats_plots
get_repo_list generates a CSV file containing a list of public repositories for a GitHub user or organization. This database allows the code to aggregate statistics across all repositories. To run, simply use the following command:
(venv) laptop:github_data $ get_repo_list -u <username/organization>
This will generate a CSV file called "<username/organization>.csv".
It is recommended to create a folder (e.g., github_data), as it will ultimately contain multiple files.
Next, let's gather the statistics for all public repositories that are not forks. For this we use another Python library, github-traffic-stats, which is accompanied by a Python script called gts.

Accessing traffic data requires a Personal Access Token (PAT), so let's create one. Generate it from the personal access tokens page in your GitHub developer settings. For selected scopes, you will only need repo.
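The commands below pass the PAT on the command line. One way to keep the token out of your shell history (an illustration only; this file and its name are not part of the package) is to store it in a file that you source before running the commands:

```shell
# ~/.github_stats_env -- keep this file out of version control (chmod 600)
export API_TOKEN='<your-PAT-here>'
```

Then run `source ~/.github_stats_env` at the start of each session.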
Then you can execute the next script:
(venv) laptop:github_data $ API_TOKEN='abcdef12345678'
(venv) laptop:github_data $ gts_run_all_repos -u <username/organization> -t $API_TOKEN -c <username/organization>.csv
This will generate CSV files, prefixed with date and time stamps, for clones, traffic, and referrals. With routine running of this code, you will accumulate additional CSV files that let you extend beyond the two-week window of data that GitHub's API retains. The data can be merged with the merge-csv.sh script:
(venv) laptop:github_data $ ./merge-csv.sh
This generates three files: merged_clone.csv, merged_traffic.csv, and merged_referrer.csv. These files are used in the final step to generate the plots.
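Conceptually, the merge keeps one header line and de-duplicates the data rows across the timestamped files. Here is a self-contained sketch with made-up sample data (the column names are illustrative, not the package's actual schema; this is not the actual merge-csv.sh):

```shell
# Two sample timestamped files with one overlapping data row (made-up data):
printf 'date,clones,uniques\n2021-01-16,2,1\n2021-01-17,3,2\n' > 2021-01-17-clone-stats.csv
printf 'date,clones,uniques\n2021-01-17,3,2\n2021-01-18,1,1\n' > 2021-01-18-clone-stats.csv

# Keep one header, then de-duplicate the data rows across all files.
# sort -u also orders rows chronologically, since each row starts with a date.
first=$(ls *-clone-stats.csv | head -n 1)
head -n 1 "$first" > merged_clone.csv
awk 'FNR > 1' *-clone-stats.csv | sort -u >> merged_clone.csv
```

With the sample data above, merged_clone.csv ends up with the header plus three unique data rows.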
Finally, to generate the static pages containing the visualizations, we use the make_stats_plots script:
(venv) laptop:github_data $ make_stats_plots -u <username> -c <username>.csv -t $API_TOKEN
This will generate all contents in the local path. Note that you can specify an output directory with the -o/--out-dir option; the default is the current path.
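Since the value of the traffic data comes from routine runs, the four steps chain naturally into a script that cron (or a scheduled CI job) can execute daily. A hypothetical wrapper, using the author's handle astrochun as the example username and assuming the package is installed in a virtualenv at ~/venv with API_TOKEN exported in the cron environment:

```shell
# Write a hypothetical daily-update wrapper script (illustration only):
cat > update_stats.sh <<'EOF'
#!/usr/bin/env bash
set -euo pipefail
source "$HOME/venv/bin/activate"
cd "$HOME/github_data"
get_repo_list -u astrochun
gts_run_all_repos -u astrochun -t "$API_TOKEN" -c astrochun.csv
./merge-csv.sh
make_stats_plots -u astrochun -c astrochun.csv -t "$API_TOKEN"
EOF
chmod +x update_stats.sh
```

A crontab entry such as `0 6 * * * $HOME/update_stats.sh` would then run it every morning.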
The resulting folder structure, for example, will be the following:
github_data/
├── data
│   ├── 2021-01-17-00h-46m-clone-stats.csv
│   ├── 2021-01-17-00h-46m-referrer-stats.csv
│   ├── 2021-01-17-00h-46m-traffic-stats.csv
│   ├── ...
│   ├── merged_clone.csv
│   ├── merged_referrer.csv
│   └── merged_traffic.csv
├── repos
│   ├── github-stats-pages.html
│   └── ...
├── styles
│   ├── css
│   │   └── style.css
│   └── js
│       ├── bootstrap.min.js
│       ├── jquery.min.js
│       ├── main.js
│       └── popper.js
├── about.html
├── index.html
├── repositories.html
└── <username>.csv
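To publish the generated pages with GitHub Pages, one hypothetical route (not prescribed by the package; it assumes Pages is configured to serve a gh-pages branch, and git >= 2.28 for `init -b`) is to commit the generated output to that branch:

```shell
cd github_data
git init -b gh-pages
git add index.html about.html repositories.html repos/ styles/
git commit -m "Update GitHub traffic stats pages"
git push -f <your-pages-remote> gh-pages
```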
Versioning
Continuous Integration
Authors
- Chun Ly, Ph.D. (@astrochun)
See also the list of contributors who participated in this project.
License
This project is licensed under the MIT License - see the LICENSE file for details.