DataDriver API package

[![Anaconda-Server Badge](https://anaconda.org/octo/pyddapi/badges/installer/conda.svg)](https://conda.anaconda.org/octo) [![Anaconda-Server Badge](https://anaconda.org/octo/pyddapi/badges/platforms.svg)](https://anaconda.org/octo/pyddapi) [![Anaconda-Server Badge](https://anaconda.org/octo/pyddapi/badges/latest_release_date.svg)](https://anaconda.org/octo/pyddapi) [![Anaconda-Server Badge](https://anaconda.org/octo/pyddapi/badges/version.svg)](https://anaconda.org/octo/pyddapi)

# DDAPI introduction

- [What is it](#what-is-it)
- [Install](#install)
- [Contributing](#contributing)

## What is it?

The following section describes the main concepts used in the Data Driver environment.

### Workflow

A Data Driver workflow is a network of tasks (Python functions) linked together. A workflow is typically described as a DAG (Directed Acyclic Graph). Tasks can execute any kind of Python code: data loading, feature engineering, model fitting, alerting, etc.

A workflow can be scheduled and monitored within the Data Driver architecture using Airflow. The Data Driver API adds data science capabilities to Airflow, along with the ability to audit the input and output of each task.
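To make the DAG idea concrete, here is a conceptual, pure-Python sketch of a workflow run in dependency order. The task names and the tiny runner are illustrative only; they are not the ddapi or Airflow API:

```python
# Conceptual sketch of a workflow as a DAG of Python tasks.
# Names and structure are illustrative, not the ddapi/Airflow API.

def load_data():
    return [1, 2, 3]

def engineer_features(rows):
    return [r * 10 for r in rows]

def fit_model(features):
    return sum(features) / len(features)

# Each entry maps a task name to (function, list of upstream task names).
dag = {
    "load": (load_data, []),
    "features": (engineer_features, ["load"]),
    "fit": (fit_model, ["features"]),
}

def run(dag):
    """Run tasks in dependency order, feeding each task its parents' outputs."""
    results = {}
    def visit(name):
        if name in results:
            return results[name]
        func, parents = dag[name]
        results[name] = func(*(visit(p) for p in parents))
        return results[name]
    for name in dag:
        visit(name)
    return results

print(run(dag)["fit"])  # → 20.0
```

Because the graph is acyclic, every task runs exactly once and downstream tasks always see their parents' outputs; Airflow applies the same principle at scheduler scale.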

### Data Driver API or ddapi

ddapi is a Python library. It is the access layer to Data Driver, and you can use it to manipulate datasets and workflows. The main usages are described below; for more information and tutorials, see the OCTO notebook tutorials repository.

```python
import dd
```

ddapi is composed of several modules.

#### DB module

```python
import dd.db
```

DB is an easier way to interact with your databases. You can use it to explore your databases or import new data.
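ddapi's own DB calls are not reproduced here. As a rough point of comparison, the exploration chores such a helper typically hides look like this with the standard library's `sqlite3`:

```python
import sqlite3

# The kind of low-level work a DB helper typically wraps:
# connect, import some data, then explore what exists.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (name TEXT, value REAL)")
conn.executemany("INSERT INTO scores VALUES (?, ?)",
                 [("a", 1.5), ("b", 2.5)])

# Explore: list the tables in the database.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)  # → ['scores']
```

A database access layer earns its keep by collapsing this boilerplate into one-liners for exploring schemas and importing data.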

#### Context module

```python
from dd.api.contexts.distributed import AirflowContext
from dd.api.contexts.local import LocalContext
```

The context is an object which will allow you to communicate with your environment during your exploration. As such, it needs to be able to communicate with your database. This is done by creating a DB object and passing it to the context constructor.
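The construction pattern described above is ordinary dependency injection. Here is a minimal pure-Python sketch of it; the class shapes and the connection string are hypothetical, not the actual ddapi signatures:

```python
# Hypothetical sketch of the "pass a DB to the context" pattern.
# These classes mimic the ddapi layout but are NOT its real API.

class DB:
    def __init__(self, url):
        self.url = url  # e.g. a PostgreSQL connection string

class LocalContext:
    def __init__(self, db):
        self.db = db  # the context talks to the environment through this DB

db = DB("postgresql://localhost/datadriver")
context = LocalContext(db)
print(context.db.url)  # → postgresql://localhost/datadriver
```

Passing the DB object in, rather than having the context build its own connection, is what lets the same exploration code run against a local database or a distributed Airflow deployment.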

#### Dataset module

```python
import dd.api.workflow.dataset
```

You can think of datasets as wrappers around pandas DataFrames. They give you access to methods you may recognise if you are familiar with that library.
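The wrapper idea can be sketched without pandas. Assume a hypothetical `Dataset` that holds an underlying table and re-exposes a couple of DataFrame-like methods; this is an illustration of the pattern, not ddapi's actual implementation:

```python
# Hypothetical illustration of a dataset wrapping a tabular object and
# re-exposing DataFrame-like methods; not ddapi's actual implementation.

class Dataset:
    def __init__(self, rows):
        self._rows = rows  # stand-in for an underlying pandas DataFrame

    def head(self, n=5):
        """Mirror DataFrame.head: return the first n rows as a new Dataset."""
        return Dataset(self._rows[:n])

    def apply(self, func):
        """Mirror a row-wise apply: transform each row with func."""
        return Dataset([func(r) for r in self._rows])

ds = Dataset([{"x": i} for i in range(10)])
print(len(ds.head(3)._rows))  # → 3
```

Returning a new `Dataset` from each method keeps calls chainable, which is also what makes the pandas-style API feel familiar.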

### Disclaimer

#### What it is not and does not do:

  • Code versioning

  • Enforcing code quality

  • Data quality tooling

  • ETL

  • Data catalog & data lineage

  • Data visualisation

  • A data lake

  • Magic

  • Coffee

#### It is a set of tools unified into a single platform to accelerate data science:

  • an API that lets data scientists use the same technologies for industrialisation that they use in exploration, because we saw this was the most impactful factor in a project's success (DDAPI)

  • monitoring of machine learning models (your code + DDAPI + Airflow)

  • scheduled builds of data science pipelines (your code + DDAPI + Airflow)

  • data science feature engineering functions (your code + BDACore)

  • metrics and data science helpers to study model drift (BDACore)

  • integration of the open source standards JupyterHub, Airflow and PostgreSQL (Lab and Factory machine roles)

## Install

Latest release:

```shell
pip install pyddapi
```

Latest build from master:

```shell
pip install -i https://pypi.anaconda.org/octo/label/dev/simple pyddapi
```

### Developer setup

#### Setup your virtual env

```shell
virtualenv venv
source venv/bin/activate
pip install -e .
pip install -r ci/tests_requirements.txt
```

_ddapi_ only supports Python 2.7 and 3.6. Running _ddapi_ with other versions is not advised; avoid it if possible, or do so at your own risk.

You can find the package in the [anaconda cloud repository](https://anaconda.org/octo/pyddapi).

## Contributing

If you want to contribute to the code, do not forget to check our [Developer Guidelines](DEVGUIDE.md).
