
Repository mining tool for structuring Git metadata at scale.


diffhouse: Repository Mining at Scale


Documentation

diffhouse is a Python solution for structuring Git metadata, designed to enable large-scale codebase analysis at practical speeds.

Key features are:

  • Fast access to commit data, file changes, and more
  • Easy integration with pandas and polars
  • Simple-to-use Python interface

Requirements

  • Python 3.10 or higher
  • Git 2.22 or higher

Git also needs to be added to the system PATH.

Limitations

At its core, diffhouse is a data extraction tool and therefore does not calculate software metrics like code churn or cyclomatic complexity; if this is needed, take a look at PyDriller instead.

Also note that revision data is limited to default branches only.

User Guide

This guide aims to cover the basic use cases of diffhouse. For the list of available repository objects and fields, check out the API Reference.

Installation

Install diffhouse from PyPI:

pip install diffhouse

Quickstart

from diffhouse import Repo

url = 'https://github.com/user/repo'

r = Repo(location=url, blobs=True).load()

for c in r.commits:
    print(c.commit_hash[:10], c.committer_date, c.author_email)

print(r.branches)
print(r.diffs[0].to_dict())

First, construct a Repo object and define its target repository via the location argument; this can be either a remote URL or a local path. Pass blobs=True to extract file data as well.

Calling Repo.load() will load all metadata into memory, which can then be accessed through the object's properties. See all properties

blobs=True requires a complete clone of the repository and therefore takes longer to execute. Omit this argument whenever possible.

Lazy Loading

For large repositories, calling load() can be slow and consume gigabytes of memory. It is recommended to use lazy streaming via a with block instead:

with Repo(location=url, blobs=True) as r:
    c = list(r.stream_commits())

    for d in r.stream_diffs():
        if d.lines_added == 3:
            break

This brings two big benefits:

  1. Object streaming functions are lazy generators, allowing for efficient memory use.
  2. No processing power is spent on objects that are not explicitly requested.

See all streaming functions
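The laziness in point 1 is the standard Python generator pattern. A minimal, diffhouse-free sketch (the function and values below are illustrative, not part of the library):

```python
# Generic illustration of the lazy-streaming pattern (not diffhouse internals):
# each item is produced only at the moment the consumer requests it.
def stream_items(n):
    for i in range(n):
        yield i  # execution pauses here until the next item is requested

# Breaking out early means the remaining items are never generated at all.
for item in stream_items(1_000_000):
    if item == 3:
        break
```

This is why stopping a stream_diffs() loop early, as in the example above, costs nothing for the unvisited diffs.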

Tabular Data

Commit, ChangedFile and Diff iterables can be passed directly to pandas and polars DataFrame constructors. No pre-processing is needed; table schemas will be inferred correctly.

import polars as pl

df = pl.DataFrame(r.changed_files)
print(df.schema)

diffhouse stores datetime values as ISO 8601 strings to preserve time zone offsets. When converting these to datetime objects in a DataFrame, use the parser's UTC option.
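With pandas, for example, the utc=True flag of pd.to_datetime normalizes mixed offsets into a single UTC-aware column (the sample timestamps below are made up, not real diffhouse output):

```python
import pandas as pd

# Hypothetical ISO 8601 timestamps with different offsets, in the format
# diffhouse uses for its datetime fields
dates = pd.Series(["2024-01-15T10:30:00+02:00", "2024-01-15T03:30:00-05:00"])

# utc=True converts every offset to UTC, yielding one tz-aware datetime dtype;
# without it, mixed offsets cannot share a single datetime column
parsed = pd.to_datetime(dates, utc=True)
print(parsed.dt.tz)  # UTC
```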
