
Repository mining tool for structuring Git metadata at scale.

Project description

diffhouse: Repository Mining at Scale


Documentation

diffhouse is a Python solution for structuring Git metadata, designed to enable large-scale codebase analysis at practical speeds.

Key features are:

  • Fast access to commit data, file changes and more
  • Easy integration with pandas and polars
  • Simple-to-use Python interface

Requirements

  • Python: 3.10 or higher
  • Git: 2.22 or higher

Git also needs to be available in the system PATH.

Limitations

At its core, diffhouse is a data extraction tool and therefore does not calculate software metrics like code churn or cyclomatic complexity; if this is needed, take a look at PyDriller instead.

Also note that revision data is limited to default branches only.

User Guide

This guide aims to cover the basic use cases of diffhouse. For the list of available repository objects and fields, check out the API Reference.

Installation

Install diffhouse from PyPI:

pip install diffhouse

Quickstart

from diffhouse import Repo

url = 'https://github.com/user/repo'

r = Repo(location=url, blobs=True).load()

for c in r.commits:
    print(c.commit_hash[:10], c.committer_date, c.author_email)

print(r.branches)
print(r.diffs[0].to_dict())

First, construct a Repo object and set its target repository via the location argument; this can be either a remote URL or a local path. Pass blobs=True to extract file data as well.

Calling Repo.load() will load all metadata into memory, which can then be accessed through the object's properties. See all properties

blobs=True requires a complete clone of the repository and therefore takes longer to execute. Omit it whenever possible.

Lazy Loading

For large repositories, calling load() can be slow and consume gigabytes of memory. It is recommended to use the lazy streaming API via a with block instead:

with Repo(location=url, blobs=True) as r:
    c = list(r.stream_commits())

    for d in r.stream_diffs():
        if d.lines_added == 3:
            break

This brings two big benefits:

  1. Object streaming functions are lazy generators, allowing for efficient memory use.
  2. No processing power is spent on objects that are not explicitly requested.

See all streaming functions
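The early-exit benefit can be sketched with a stand-in generator playing the role of stream_commits() (the field names here are illustrative, not diffhouse's real schema): once a match is found, the remaining records are never produced at all.

```python
def fake_stream_commits(n):
    """Stand-in for Repo.stream_commits(): yields one record at a time."""
    for i in range(n):
        yield {"commit_hash": f"{i:040x}", "lines_added": i % 10}

# Stop at the first match; the vast majority of the million records
# below are never generated, let alone held in memory at once.
first_match = None
for c in fake_stream_commits(1_000_000):
    if c["lines_added"] == 3:
        first_match = c
        break

print(first_match["commit_hash"][:10])
```

Contrast this with load(), which materializes every record up front whether or not it is ever inspected.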

Tabular Data

Commit, ChangedFile and Diff iterables can be passed directly to pandas and polars DataFrame constructors. No pre-processing is needed; table schemas will be inferred correctly.

import polars as pl

df = pl.DataFrame(r.changed_files)
print(df.schema)
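The same pattern works with pandas. As a self-contained sketch, a stand-in dataclass takes the place of a real diffhouse Commit (the two fields here are illustrative):

```python
from dataclasses import dataclass

import pandas as pd

# Stand-in record; real diffhouse Commit objects expose more fields.
@dataclass
class CommitRecord:
    commit_hash: str
    author_email: str

commits = [
    CommitRecord("abc123", "a@example.com"),
    CommitRecord("def456", "b@example.com"),
]

# pandas infers one column per field straight from the iterable.
df = pd.DataFrame(commits)
print(df.columns.tolist())  # ['commit_hash', 'author_email']
```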

diffhouse stores datetime values as ISO 8601 strings to preserve time zone offsets. When converting these to datetime objects in a DataFrame, use the parser's UTC option.
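For example, with pandas the UTC option is the utc=True flag of pd.to_datetime (the timestamp values below are made up): it converts every offset to UTC, so values that name the same instant compare equal and the column gets a single timezone-aware dtype.

```python
import pandas as pd

# Hypothetical committer_date values: ISO 8601 strings with mixed offsets.
s = pd.Series(["2024-03-01T12:30:00+02:00", "2024-03-01T10:30:00+00:00"])

# utc=True normalizes every offset to UTC during parsing.
ts = pd.to_datetime(s, utc=True)

print(ts[0] == ts[1])  # True: both strings name the same instant
```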
