Team Viewer - Data Extract for analysis

Project description

Team View - Extract (Data extraction for team analysis)

Extract team analysis data from git (current) and other relevant sources (future).

Pre-requisites for Use

  1. Tokei (this fork has initial, rudimentary support for Jupyter notebooks). Tokei is used to collect metrics on the volume of source code per language.

If you don't have Rust installed and aren't currently using Tokei, we recommend using the team-view-extract Docker image.

Setup

A JSON configuration file that defines the "project" to extract must be provided.

Example:

{
  "extracts": [
    {
      "name": "Project 1",
      "repos": [
        {
          "name": "TeamView",
          "remote": "git@github.com:rappdw/TeamViewer.git"
        },
        {
          "name": "team-view-extract",
          "remote": "git@github.com:rappdw/team-viewer-extract.git"
        }
      ],
      "start_date": "2018-07-18",
      "end_date": "2018-08-31"
    }
  ],
  "output_path": "~/.local/share/cache/TeamView",
  "mailmap_file": "~/.local/share/cache/.mailmap",
  "logging": 20
}

Multiple extracts can be defined in a single configuration file. start_date, end_date, mailmap_file, and logging are all optional: logging defaults to the info level (20), start_date defaults to the beginning of the project, and end_date defaults to today. If no mailmap_file is specified, the standard git configuration applies.
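The defaulting rules above can be sketched in a few lines of Python. This is an illustrative helper, not the actual tv_extract loader (the function name and the use of None for "beginning of project" are assumptions):

```python
import json
from datetime import date

def load_extract_config(text: str) -> dict:
    """Parse a tv_extract-style configuration and apply the documented defaults."""
    cfg = json.loads(text)
    # logging defaults to the info level (numeric value 20)
    cfg.setdefault("logging", 20)
    for extract in cfg.get("extracts", []):
        # start_date defaults to the beginning of the project (modeled as None here);
        # end_date defaults to today
        extract.setdefault("start_date", None)
        extract.setdefault("end_date", date.today().isoformat())
    return cfg

cfg = load_extract_config(
    '{"extracts": [{"name": "Project 1", "repos": []}], "output_path": "/tmp/tv"}'
)
```

Only output_path and the extracts list are required; everything else is filled in as described above.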

A good way to create the mailmap file is to construct it from the output of git shortlog -sne for each repository.
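As a starting point, the shortlog output can be parsed so that duplicate identities are easy to spot and merge into a .mailmap by hand. This helper is illustrative only (not part of tv_extract), assuming the usual "count, tab, name, email" line format of git shortlog -sne:

```python
import re

def parse_shortlog(text: str) -> list[tuple[int, str, str]]:
    """Parse `git shortlog -sne` output into (count, name, email) tuples,
    sorted by descending commit count."""
    entries = []
    for line in text.splitlines():
        # Each line looks like: "   42\tJane Doe <jane@example.com>"
        m = re.match(r"\s*(\d+)\s+(.*) <(.*)>", line)
        if m:
            entries.append((int(m.group(1)), m.group(2), m.group(3)))
    return sorted(entries, reverse=True)

sample = "   42\tJane Doe <jane@example.com>\n    3\tjdoe <jane@old-host>\n"
identities = parse_shortlog(sample)
```

Identities that clearly belong to the same person (here "Jane Doe" and "jdoe") would then be mapped to one canonical name and email in the .mailmap file.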

Results

For each extract specified in the configuration file, a sub-directory will be created in the directory specified by output_path. The following files will be created:

  • author_totals.csv - Commit counts by author and repository (excluding merge commits)
  • loc.csv - File counts by language, commit and repository (commits to master branch only)
  • loc_delta.csv - File counts by author, language, commit and repository (excluding merge commits)
  • prs.csv - Pull requests by repo, including the duration between the last commit to a branch and its merge to master
  • repo.csv - Current state of volume of code by language for each repo
  • revs.csv - Revision graph by repo
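These CSVs are plain tabular files, so they can be post-processed with the standard library alone. The sketch below aggregates author_totals.csv; the column names ("author", "repo", "commit_count") are assumptions, so check the header row of your actual extract output:

```python
import csv
import io
from collections import Counter

def commits_per_author(csv_text: str) -> Counter:
    """Sum commit counts per author across all repositories.

    Column names are assumed, not confirmed by the tv_extract docs.
    """
    totals = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["author"]] += int(row["commit_count"])
    return totals

# Synthetic sample in the assumed format
sample = (
    "author,repo,commit_count\n"
    "Jane,TeamView,10\n"
    "Jane,team-view-extract,5\n"
    "Bob,TeamView,7\n"
)
totals = commits_per_author(sample)
```

In practice you would read the file from the extract's sub-directory under output_path rather than from an in-memory string.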

'Temporary' Files

~/.local/share/cache is used to cache temporary files, including checkouts of the repos specified by each extract and a cache of the LOC revision history of each repo. If present, this cache is updated on subsequent runs; if not, it is recreated from scratch.
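The cache location can be inspected or pre-created with pathlib. This is a sketch of the behaviour described above; the layout inside the cache directory is an implementation detail of tv_extract:

```python
from pathlib import Path

# Cache root as documented; recreated from scratch when absent
cache_root = Path("~/.local/share/cache").expanduser()
cache_root.mkdir(parents=True, exist_ok=True)
```

Deleting this directory is safe: it only forces the next run to re-clone the repos and rebuild the LOC history cache.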

Project details


Download files


Source Distributions

No source distribution files are available for this release.

Built Distribution


tv_extract-1.0.15-py3-none-any.whl (18.4 kB)

Uploaded Python 3

File details

Details for the file tv_extract-1.0.15-py3-none-any.whl.

File metadata

  • Download URL: tv_extract-1.0.15-py3-none-any.whl
  • Upload date:
  • Size: 18.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3

File hashes

Hashes for tv_extract-1.0.15-py3-none-any.whl

  • SHA256: ac7fd5bdaadc9a0f0e15ff189ef54e4cf58796f51fccca37e2c8e57ee0bb5803
  • MD5: 48017b36eaed9ea50eb476ffcaa99796
  • BLAKE2b-256: d6df648b02f326d634ebb3eabd89c4fb8f755590782ed1befe0c11373c842f72

