
DataDriver API package

Project description

[![Anaconda-Server Badge](https://anaconda.org/octo/pyddapi/badges/installer/conda.svg)](https://conda.anaconda.org/octo) [![Anaconda-Server Badge](https://anaconda.org/octo/pyddapi/badges/platforms.svg)](https://anaconda.org/octo/pyddapi) [![Anaconda-Server Badge](https://anaconda.org/octo/pyddapi/badges/latest_release_date.svg)](https://anaconda.org/octo/pyddapi) [![Anaconda-Server Badge](https://anaconda.org/octo/pyddapi/badges/version.svg)](https://anaconda.org/octo/pyddapi)

# DDAPI introduction

- [What is it](#what-is-it)
- [Install](#install)
- [Contributing](#contributing)

## What is it?

The following section describes the main concepts used in the Data Driver environment.

### Workflow

A Data Driver workflow is a network of tasks (Python functions) linked together. Such a workflow is typically described as a DAG (Directed Acyclic Graph). The tasks can execute any kind of Python code: data loading, feature engineering, model fitting, alerting, etc.

A workflow can be scheduled and monitored in the Data Driver architecture with Airflow. The Data Driver API adds data science capabilities to Airflow, along with the ability to audit the input / output of each task.
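As an illustration, a bare Airflow DAG of Python tasks looks like the sketch below. The task names, bodies and schedule are made up for the example; ddapi's own additions on top of Airflow are not shown here.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

# Illustrative task bodies; in a real workflow these would load data,
# engineer features, fit a model, send alerts, etc.
def extract():
    print("loading data")

def train():
    print("fitting a model")

dag = DAG("example_pipeline",
          start_date=datetime(2019, 1, 1),
          schedule_interval="@daily")

extract_task = PythonOperator(task_id="extract", python_callable=extract, dag=dag)
train_task = PythonOperator(task_id="train", python_callable=train, dag=dag)

# The DAG: train runs only after extract has succeeded.
extract_task >> train_task
```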

### Data Driver API or ddapi

ddapi is a Python library. It is the access layer to Data Driver: you can use it to manipulate datasets and workflows. The main usages are described below; for more information and tutorials, see the OCTO notebook tutorials repository.

```python
import dd
```

ddapi is composed of several modules.

#### DB module

```python
import dd.db
```

DB is an easier way to interact with your databases. You can use it to explore your databases or import new data.
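For example, a connection might be set up as follows. This is a minimal sketch: the `DB` import path and the constructor arguments below are assumptions, not the documented signature; the OCTO notebook tutorials show the real API.

```python
from dd.db import DB  # assumed import path, inferred from `import dd.db` above

# Hypothetical connection parameters; the actual constructor may differ.
db = DB(dbtype="postgresql",
        hostname="localhost",
        dbname="datadriver",
        username="user",
        password="secret")
```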

#### Context module

```python
from dd.api.contexts.distributed import AirflowContext
from dd.api.contexts.local import LocalContext
```

The context is an object which will allow you to communicate with your environment during your exploration. As such, it needs to be able to communicate with your database. This is done by creating a DB object and passing it to the context constructor.
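Putting the two together might look like this (again a sketch: the context constructor may accept more options than shown here):

```python
from dd.api.contexts.local import LocalContext

# `db` is the DB object created above; the context wraps it and mediates
# every read and write during exploration.
context = LocalContext(db)
```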

#### Dataset module

```python
import dd.api.workflow.dataset
```

You may consider datasets as wrappers around pandas DataFrames. They give you access to methods you may recognise if you are familiar with that awesome library.
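A hedged sketch of what working with a dataset could look like; the accessor and method names below are hypothetical and only meant to convey the pandas-like feel, not the exact ddapi API:

```python
# Hypothetical: obtain a dataset through the context, then use
# DataFrame-style methods on it. Exact names may differ in ddapi.
dataset = context.table("public.customers")   # hypothetical accessor
sample = dataset.head()                       # pandas-style inspection
```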

### Disclaimer

#### What it does not do / what it is not for:

  • Code versioning

  • Enforcing good code quality

  • Data quality tool

  • ETL

  • Data Catalog & Data Lineage

  • Data visualisation

  • Datalake

  • Magical stuff

  • Coffee

#### It is a set of tools unified into a single platform to accelerate data science:

  • we have made an API that lets data scientists use the same technologies in industrialisation that they use in exploration, because we saw this was the parameter with the most impact on a project's success (DDAPI)

  • monitor Machine Learning models (your code + DDAPI + Airflow)

  • schedule builds of data science pipelines (your code + DDAPI + Airflow)

  • data science feature engineering functions (your code + BDACore)

  • metrics and data science helpers to study model shift (BDACore)

  • integration of the open source standards JupyterHub, Airflow and PostgreSQL (Lab and Factory machine roles)

## Install

Latest release:

```shell
pip install pyddapi
```

Latest build from master:

```shell
pip install -i https://pypi.anaconda.org/octo/label/dev/simple pyddapi
```

### Developer setup

#### Set up your virtual env

```shell
virtualenv venv
source venv/bin/activate
pip install -e .
pip install -r ci/tests_requirements.txt
```

_ddapi_ only supports Python 2.7 and 3.6. Running _ddapi_ with other versions is not advised, so avoid it if possible, or do it at your own risk.

You can find the package in the [anaconda cloud repository](https://anaconda.org/octo/pyddapi).

## Contributing

In case you want to contribute to the code, do not forget to check our [Developer Guidelines](DEVGUIDE.md).

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pyddapi-3.0.3.tar.gz (54.1 kB)

Built Distribution

pyddapi-3.0.3-py2.py3-none-any.whl (74.2 kB)

File details

Details for the file pyddapi-3.0.3.tar.gz.

File metadata

  • Download URL: pyddapi-3.0.3.tar.gz
  • Upload date:
  • Size: 54.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.7.0 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.6.8

File hashes

Hashes for pyddapi-3.0.3.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `e8ea4ffe95719d9bbe16ecd8478b7d2514c22933c537ea85d5395db03f0455cd` |
| MD5 | `0fa0a0ac369ca8642a6fd11f2140a68a` |
| BLAKE2b-256 | `ee424429ea6261179b25ab291599ee7f71c217b123a9ac72ffee357fbb061d55` |


File details

Details for the file pyddapi-3.0.3-py2.py3-none-any.whl.

File metadata

  • Download URL: pyddapi-3.0.3-py2.py3-none-any.whl
  • Upload date:
  • Size: 74.2 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.7.0 requests-toolbelt/0.9.1 tqdm/4.32.1 CPython/3.6.8

File hashes

Hashes for pyddapi-3.0.3-py2.py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `9fccebb962407d5ab5d81afcc500049568e478939a3d82d10b2eaac6953a1426` |
| MD5 | `ac73cf829252b52798ed807906adf214` |
| BLAKE2b-256 | `7b3872851a1968ea4da4483905c84673b246b5f8118be97f8c1e09cf8f01a140` |

