
Truva

Data curation engine for LLM fine-tuning

Truva curates your fine-tuning data so you train on signal, not noise.

A CLI-first data curation engine for ML engineers who fine-tune language models. Truva takes a messy dataset and produces a smaller, higher-quality "gold" dataset by removing redundancy, scoring information density, and detecting contradictions.

Goal: Reduce dataset size by 50–80% while maintaining or improving downstream model accuracy, cutting GPU training costs proportionally.

Quick Install

pip install truva

30-Second Example

# Deduplicate a dataset with default settings
truva dedupe ./data.jsonl --output ./deduped.jsonl

# Deduplicate with a custom threshold and generate a report
truva dedupe ./data.jsonl --threshold 0.9 --output ./deduped.jsonl --report ./report.json

# Generate embeddings for a dataset
truva embed ./data.jsonl --output ./embeddings.npy

What It Does

Before                   After
50,000 rows              12,000 rows
Redundant examples       Unique, representative samples
Unknown quality          Scored and filtered
Hidden contradictions    Flagged for review

Features

Semantic Deduplication

Removes near-duplicate rows using embedding similarity and Union-Find clustering. Each cluster keeps the single most representative example (closest to centroid).

truva dedupe ./data.jsonl --threshold 0.95
  • --threshold 0.95 (default): Aggressive but safe for most fine-tuning datasets
  • --threshold 0.85: More aggressive, catches paraphrases
  • --threshold 1.0: Only removes exact semantic matches
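The clustering step described above can be sketched roughly as follows. This is an illustrative sketch, not Truva's actual implementation; the `dedupe` function, its signature, and the brute-force pairwise comparison are all assumptions made for clarity.

```python
import numpy as np

def dedupe(embeddings: np.ndarray, threshold: float = 0.95) -> list[int]:
    """Return indices of representatives, one per near-duplicate cluster."""
    # Normalize rows so dot products are cosine similarities.
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    n = len(normed)
    parent = list(range(n))

    def find(i: int) -> int:
        # Union-Find root lookup with path compression.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Union every pair whose cosine similarity clears the threshold.
    sims = normed @ normed.T
    for i in range(n):
        for j in range(i + 1, n):
            if sims[i, j] >= threshold:
                parent[find(i)] = find(j)

    # Group members by root, then keep the member closest to the cluster centroid.
    clusters: dict[int, list[int]] = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)

    keep = []
    for members in clusters.values():
        centroid = normed[members].mean(axis=0)
        keep.append(max(members, key=lambda m: float(normed[m] @ centroid)))
    return sorted(keep)
```

A production version would replace the O(n²) similarity matrix with an approximate nearest-neighbor index, but the union-then-pick-representative structure is the same.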

Embedding Generation

Compute vector embeddings for your dataset using local models or the OpenAI API.

# Local (free, no API key needed)
truva embed ./data.jsonl --provider local --model all-MiniLM-L6-v2

# OpenAI API
truva embed ./data.jsonl --provider api --model text-embedding-3-small
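Under the hood, both providers reduce to the same shape of work: read the records, batch them, and call an encoder. The helper below is a hypothetical sketch of that loop (the `embed_jsonl` name and signature are not part of Truva's API); any callable that maps a list of strings to vectors fits the `encode` slot.

```python
import json

import numpy as np

def embed_jsonl(path: str, encode, text_field: str = "text",
                batch_size: int = 64) -> np.ndarray:
    """Embed every record of a JSONL file, calling `encode` on fixed-size batches."""
    texts = []
    with open(path) as f:
        for line in f:
            if line.strip():
                texts.append(json.loads(line)[text_field])
    # Batching keeps memory bounded locally and amortizes request overhead
    # when the encoder is a remote API call.
    batches = [np.asarray(encode(texts[i:i + batch_size]))
               for i in range(0, len(texts), batch_size)]
    return np.vstack(batches)
```

For the local path you could pass `SentenceTransformer("all-MiniLM-L6-v2").encode` from the sentence-transformers library as `encode`; for the API path, a thin wrapper around the OpenAI embeddings endpoint fills the same role.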

Supported Formats

  • JSONL — One JSON object per line (.jsonl, .json)
  • CSV — Auto-detects the text column, or specify one with --text-field
  • Hugging Face Datasets — Pass a dataset identifier like username/dataset
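How the text column is auto-detected is not documented here; one plausible heuristic, picking the field whose string values are longest on average, can be sketched as follows. The `guess_text_field` function is an assumption for illustration, not Truva's actual detection logic.

```python
def guess_text_field(rows: list[dict]) -> str:
    """Guess the text column: the field with the longest strings on average."""
    lengths: dict[str, list[int]] = {}
    for row in rows:
        for key, value in row.items():
            if isinstance(value, str):
                lengths.setdefault(key, []).append(len(value))
    if not lengths:
        raise ValueError("no string-valued fields found")
    return max(lengths, key=lambda k: sum(lengths[k]) / len(lengths[k]))
```

On rows like `{"id": "42", "text": "a long training example..."}` this picks "text", since ID-like columns have short values; an explicit --text-field would override any such guess.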

Configuration

All options are available as CLI flags:

--threshold FLOAT       Cosine similarity threshold for dedup (0.0–1.0)
--provider [local|api]  Embedding provider
--model TEXT            Model name
--text-field TEXT       Column/field to use (auto-detected if not set)
--format TEXT           Input format: auto, jsonl, csv, hf
--output, -o TEXT       Output file path
--report TEXT           Path for JSON report

Requirements

  • Python 3.10+
  • Works on macOS (Apple Silicon) and Linux

License

Apache 2.0 — see LICENSE for details.

Download files

Download the file for your platform.

Source Distribution

truva-0.1.1.tar.gz (18.3 kB)


Built Distribution


truva-0.1.1-py3-none-any.whl (20.8 kB)


File details

Details for the file truva-0.1.1.tar.gz.

File metadata

  • Download URL: truva-0.1.1.tar.gz
  • Size: 18.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.13

File hashes

Hashes for truva-0.1.1.tar.gz
Algorithm Hash digest
SHA256 2132ca14a94131e98a764f385867826b317e587351a49779c3e6f8ee37bba4a4
MD5 8dbd109b2df65f12e6cf9eef3510573b
BLAKE2b-256 8bc99f537b27da43fabc38c8b5753e1493c4b462f5590233647407165b896c44


File details

Details for the file truva-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: truva-0.1.1-py3-none-any.whl
  • Size: 20.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.13

File hashes

Hashes for truva-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 91cd670f178d44882fbe17cf1a4dfef625cd8007d08e5818fb2dcd645a0810b5
MD5 aff9134c256185d358f16fb4f120b2d9
BLAKE2b-256 25086b1da4825c1c3ca4fc482a0023beb112f117060d1afe280a09c8ede4b8d0

