
Git for data scientists - manage your code and data together

Project description

DVC logo

Website | Docs | Blog | Twitter | Chat (Community & Support) | Tutorial | Mailing List

Travis | Code Climate | Codecov | Donate | Conda-forge

Data Version Control or DVC is an open-source tool for data science and machine learning projects. Key features:

  1. a simple command-line, Git-like experience. It does not require installing or maintaining any databases, and it does not depend on any proprietary online services;

  2. it manages and versions datasets and machine learning models. Data can be saved in S3, Google Cloud, Azure, Alibaba Cloud, an SSH server, HDFS, or even a local HDD RAID;

  3. it makes projects reproducible and shareable; it helps answer the question of how a model was built;

  4. it helps manage experiments with Git tags or branches and metrics tracking (see the sketch below).

DVC aims to replace spreadsheet and document-sharing tools (such as Excel or Google Docs) that are commonly used as a team's knowledge repository and ledger, ad-hoc scripts for tracking, moving, and deploying different model versions, and ad-hoc data file suffixes and prefixes.
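For example, a metric file written by an evaluation stage can be compared across Git tags or branches. The following is a minimal sketch, assuming a model file model.p, an evaluation script test.py, and a metric file auc.metric (none of which are defined here); exact flag names may differ between DVC releases:

$ dvc run -d model.p -d test.py -M auc.metric python test.py   # -M marks an uncached metric output
$ git add . && git commit -m 'Evaluate baseline model'
$ git tag -a baseline -m 'baseline experiment'
# later, compare the metric across all tags
$ dvc metrics show -T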

How DVC works

We encourage you to read our Get Started guide to better understand what DVC is and how it fits your scenarios.

The easiest (but not perfect!) analogy to describe it: DVC is Git (or Git-LFS, to be precise) plus makefiles, done right and tailored specifically for ML and data science scenarios.

  1. Git/Git-LFS part - DVC helps you store and share data artifacts and models, connecting them with your Git repository.

  2. Makefiles part - DVC describes how one data or model artifact is built from other data and code.

DVC usually runs along with Git. Git is used as usual to store and version code and DVC meta-files. DVC helps store data and model files seamlessly outside of Git, while preserving almost the same user experience as if they were stored in Git itself. To store and share the data cache, DVC supports remotes - any cloud (S3, Azure, Google Cloud, etc.) or any on-premises network storage (via SSH, for example).
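A minimal sketch of how this looks in practice (the bucket path and repository URL below are hypothetical; remotes are recorded in the .dvc/config file):

$ dvc remote add myremote -d s3://mybucket/project-cache
$ git add .dvc/config && git commit -m 'Configure default remote'
$ dvc push                     # upload the data cache to the remote

# On a teammate's machine:
$ git clone https://github.com/example/project.git && cd project
$ dvc pull                     # download the data referenced by the DVC meta-files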

[Diagram: how DVC works]

The DVC pipelines feature (a computational graph) connects code and data together. You can explicitly specify, run, and save the information that a certain command, with certain dependencies, needs to be run to produce a model. See the quick start section below or the Get Started tutorial to learn more.
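Each dvc run invocation records a stage in a small *.dvc meta-file, and chained stages form the pipeline. A hedged sketch of inspecting and re-running a pipeline, using the model.p.dvc stage created in the quick start below (command names as of this DVC release; check dvc pipeline --help):

$ dvc pipeline show --ascii model.p.dvc   # visualize the graph of stages leading to the model
$ dvc repro model.p.dvc                   # re-execute only the stages whose dependencies changed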

Quick start

Please read Get Started for the full version. Common workflow commands include:

Track data:

$ git add train.py
$ dvc add images.zip

Connect code and data by commands:

$ dvc run -d images.zip -o images/ unzip -q images.zip
$ dvc run -d images/ -d train.py -o model.p python train.py

Make changes and reproduce:

$ vi train.py
$ dvc repro model.p.dvc

Share code:

$ git add .
$ git commit -m 'The baseline model'
$ git push

Share data and ML models:

$ dvc remote add myremote -d s3://mybucket/image_cnn
$ dvc push
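To switch the whole workspace, code and data together, back to a previously committed version, a hedged sketch (reusing the hypothetical baseline tag from the experiments sketch above):

$ git checkout baseline   # restores code and DVC meta-files
$ dvc checkout            # updates data files in the workspace to match the meta-files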

Installation

Read these instructions for more details. There are four options to install DVC: pip, Homebrew, Conda (Anaconda), or an OS-specific package:

pip (PyPI)

pip install dvc

Depending on the remote storage type you plan to use to keep and share your data, you might need to specify one of the optional dependencies: s3, gs, azure, oss, ssh, or all to include them all. For example, pip install dvc[s3] installs the boto3 library along with DVC to support AWS S3 storage.
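A few hedged examples (the quotes guard the brackets from shells such as zsh):

$ pip install "dvc[s3]"        # AWS S3 support (installs boto3)
$ pip install "dvc[gs,azure]"  # several remote types at once
$ pip install "dvc[all]"       # all optional remote dependencies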

To install the development version, run:

pip install git+git://github.com/iterative/dvc

Homebrew

brew install iterative/homebrew-dvc/dvc

or:

brew cask install iterative/homebrew-dvc/dvc

Conda (Anaconda)

conda install -c conda-forge dvc

Currently, it supports only Python versions 2.7, 3.6, and 3.7.
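A minimal sketch, assuming a fresh environment (named dvc-env here for illustration) with a supported Python:

$ conda create -n dvc-env python=3.7
$ conda activate dvc-env
$ conda install -c conda-forge dvc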

Package

Self-contained packages for Windows, Linux, and macOS are available. The latest versions of the packages can be found on the GitHub releases page.

Ubuntu / Debian (deb)

sudo wget https://dvc.org/deb/dvc.list -O /etc/apt/sources.list.d/dvc.list
sudo apt-get update
sudo apt-get install dvc

Fedora / CentOS (rpm)

sudo wget https://dvc.org/rpm/dvc.repo -O /etc/yum.repos.d/dvc.repo
sudo yum update
sudo yum install dvc

Arch Linux (AUR)

This is an unofficial package; for any inquiries regarding the AUR package, refer to its maintainer.

yay -S dvc

Contributing

Contributions are welcome! Please see our Contributing Guide for more details.


Mailing List

Want to stay up to date? Want to help improve DVC by participating in our occasional polls? Subscribe to our mailing list. No spam, really low traffic.


