Repository mining tool for structuring Git metadata at scale.

diffhouse: Repository Mining at Scale


Documentation

diffhouse is a Python solution for structuring Git metadata, designed to enable large-scale codebase analysis at practical speeds.

Key features are:

  • Fast access to commit data, file changes and more
  • Easy integration with pandas and polars
  • Simple-to-use Python interface

Requirements

Python 3.10 or higher
Git 2.22 or higher

Git also needs to be added to the system PATH.

Limitations

At its core, diffhouse is a data extraction tool: it does not calculate software metrics such as code churn or cyclomatic complexity. If you need those, take a look at PyDriller instead.

Also note that revision data covers the default branch only.

User Guide

This guide aims to cover the basic use cases of diffhouse. For the list of available repository objects and fields, check out the API Reference.

Installation

Install diffhouse from PyPI:

pip install diffhouse

Quickstart

from diffhouse import Repo

url = 'https://github.com/user/repo'

r = Repo(location=url, blobs=True).load()

for c in r.commits:
    print(c.commit_hash[:10], c.committer_date, c.author_email)

print(r.branches)
print(r.diffs[0].to_dict())

First, construct a Repo object and define its target repository via the location argument; this can be either a remote URL or a local path. Pass blobs=True to extract file data as well.

Calling Repo.load() will load all metadata into memory, which can then be accessed through the object's properties. See all properties

blobs=True requires a complete clone of the repository and therefore takes longer to execute. Omit this argument whenever possible.

Lazy Loading

For large repositories, calling load() can be slow and consume gigabytes of memory. It is recommended to use the lazy interface via a with block instead:

with Repo(location=url, blobs=True) as r:
    c = list(r.stream_commits())

    for d in r.stream_diffs():
        if d.lines_added == 3:
            break

This brings two big benefits:

  1. Object streaming functions are lazy generators, allowing for efficient memory use.
  2. No processing power is spent on objects that are not explicitly requested.

See all streaming functions
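The memory benefit can be illustrated with a plain generator standing in for a streaming function (purely illustrative; no diffhouse objects are involved):

```python
import sys

# A list materializes every element up front; a generator yields one at a time.
eager = [i for i in range(1_000_000)]
lazy = (i for i in range(1_000_000))

print(sys.getsizeof(eager) > 1_000_000)  # True: the list holds all elements
print(sys.getsizeof(lazy) < 1_000)       # True: the generator is a tiny object

# Stopping early, as in the stream_diffs() loop above, wastes no work:
first = next(lazy)
print(first)  # 0
```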

Tabular Data

Commit, ChangedFile and Diff iterables can be passed directly to pandas and polars DataFrame constructors. No pre-processing is needed; table schemas will be inferred correctly.

import polars as pl

df = pl.DataFrame(r.changed_files)
print(df.schema)

diffhouse stores datetime values as ISO 8601 strings to preserve time zone offsets. When converting these to datetime objects in a DataFrame, enable the parser's UTC option so that mixed offsets are normalized to a single time zone.
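For example, with pandas (a sketch using made-up records; committer_date is a real Commit field, but the values here are illustrative):

```python
import pandas as pd

# diffhouse emits datetimes as ISO 8601 strings that keep their UTC offsets.
rows = [
    {"commit_hash": "a" * 40, "committer_date": "2024-01-05T10:00:00+02:00"},
    {"commit_hash": "b" * 40, "committer_date": "2024-01-05T12:30:00-05:00"},
]
df = pd.DataFrame(rows)

# Mixed offsets require utc=True; the column becomes tz-aware in UTC.
df["committer_date"] = pd.to_datetime(df["committer_date"], utc=True)
print(df["committer_date"].dtype)  # datetime64[ns, UTC]
```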

