Datera Python SDK

Introduction

This is v1.2 of the Python SDK for the Datera Fabric Services API. By downloading and using this package you implicitly accept the terms in COPYING.

Users of this package are assumed to be familiar with the Datera API; details of the API itself are not necessarily covered by this SDK.

Features

  • Automatic session management and login
  • Automatic and configurable request retries
  • Object to REST request translation
  • Standard Logging Format (compatible with Datera SREQ log parsing)
  • Endpoint validation (toggleable)
  • Dot-notation access to response attributes
  • UDC compliance

Installation

From Source

    apt-get install python-virtualenv (or yum install python-virtualenv for CentOS)
    virtualenv sdk
    source sdk/bin/activate
    git clone https://github.com/Datera/python-sdk.git
    cd python-sdk
    pip install -r requirements.txt
    python setup.py install

From PyPI

    pip install -U dfs_sdk

Universal Datera Config

The Universal Datera Config (UDC) is a config that can be specified in a number of ways:

  • JSON file with any of the following names:
    • .datera-config
    • datera-config
    • .datera-config.json
    • datera-config.json
  • The JSON file has the following structure:
     {"mgmt_ip": "1.1.1.1",
      "username": "admin",
      "password": "password",
      "tenant": "/root",
      "api_version": "2.3",
      "ldap": ""}
  • The file can be in any of the following places. This is also the lookup order for config files:
    • current directory
    • home directory
    • home/config directory
    • /etc/datera
  • If no Datera config file is found and a cinder.conf file is present, the config parser will try to pull connection credentials from the cinder.conf
  • Tenant, API version, and LDAP are always optional, but it is generally suggested to include them in your UDC file for easy reference.
  • Instead of a JSON file, environment variables can be used (see the sketch following this list):
    • DAT_MGMT
    • DAT_USER
    • DAT_PASS
    • DAT_TENANT
    • DAT_API
    • DAT_LDAP
  • Most tools built to use the Universal Datera Config will also allow for providing/overriding any of the config values via command line flags.
    • --hostname
    • --username
    • --password
    • --tenant
    • --api-version
    • --ldap
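
As a minimal sketch of the environment-variable path (the values below are placeholders for your own cluster's credentials):

    import os

    # Placeholder credentials; substitute your cluster's values.
    os.environ["DAT_MGMT"] = "1.1.1.1"
    os.environ["DAT_USER"] = "admin"
    os.environ["DAT_PASS"] = "password"
    os.environ["DAT_API"] = "2.3"    # optional, as are DAT_TENANT and DAT_LDAP

    from dfs_sdk.scaffold import get_api

    api = get_api()   # with no config file present, the DAT_* variables are used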

Developing with Universal Datera Config

Using UDC in a new Python tool is simple; just add the following to your Python script:

from dfs_sdk import scaffold

parser = scaffold.get_argparser()
parser.add_argument('my-new-arg')
args = parser.parse_args()

If you want to use subparsers or customize the help output of your parser, then use the following:

import argparse
from dfs_sdk import scaffold

top_parser = scaffold.get_argparser(add_help=False)
new_parser = argparse.ArgumentParser(parents=[top_parser])
new_parser.add_argument('my-new-arg')
args = new_parser.parse_args()

Inside a script, the config can be retrieved by calling:

from dfs_sdk import scaffold

scaffold.get_argparser()
config = scaffold.get_config()

NOTE: It is no longer required to call scaffold.get_argparser() before calling scaffold.get_config(). This is only necessary when building a CLI tool that needs the CLI parser.
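
Putting these pieces together, a minimal UDC-aware tool might look like the sketch below (the --my-new-arg flag is a placeholder, and get_config() is assumed to return the parsed UDC values in a dict-like form):

    from dfs_sdk import scaffold
    from dfs_sdk.scaffold import get_api

    parser = scaffold.get_argparser()
    parser.add_argument('--my-new-arg')   # your tool's own flags go here
    args = parser.parse_args()

    config = scaffold.get_config()        # the resolved UDC values
    api = get_api()                       # API handle built from the same UDC
    print(config)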

Logging

To set a custom logging.json file:

    export DSDK_LOG_CFG=your/log/location.json

Alternatively, the value can be set to debug, info, or error:

    export DSDK_LOG_CFG=info

To log to stdout, set DSDK_LOG_STDOUT. The value can be any logging level supported by the Python logging module (e.g. debug, info):

    export DSDK_LOG_STDOUT=debug

The debug logs generated by the python-sdk are quite large and are written via a rotating file handler (provided that a custom logging.json file is not specified).

Managed Objects

Datera provides an application-driven storage management model, whose goal is to closely align storage with a corresponding application's requirements.

The main storage objects are defined and differentiated as follows:

Application Instance (AppInstance)

-    Corresponds to an application, service, etc.
-    Contains zero or more Storage Instances

Storage Instance

-    Corresponds to one set of storage requirements for a given AppInstance
-    ACL Policies, including IQN Initiators
-    Target IQN
-    Contains zero or more Volumes

Volumes

-    Corresponds to a single allocated storage object
-    Size (default unit is GB)
-    Replication Factor
-    Performance Policies (QoS for Bandwidth and IOPS)
-    Protection Policies (Snapshot scheduling)

Another way of viewing the managed object hierarchy is as follows:

app_instances:
    - storage_instances:                 (1 or more per app_instance)
        + acl_policy                     (1 or more host initiators)
        + iqn                            (target IQN)
        + ips                            (target IPs)
        + volumes:                       (1 or more per storage_instance)
            * name
            * size
            * replication
            * performance_policy         (i.e. QoS)
            * protection_policy          (i.e. Snapshot schedules)
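
Expressed through the SDK, building that hierarchy might look like the sketch below. Only api.app_instances.create() appears verbatim elsewhere in this README; the nested storage_instances and volumes collections are assumed to follow the same pattern, and attribute names such as replica_count are illustrative, so check them against the Datera REST API Guide:

    from dfs_sdk import get_api

    api = get_api("1.1.1.1", "admin", "password", "v2.3")

    # One app_instance containing one storage_instance containing one volume.
    ai = api.app_instances.create(name="my-app")
    si = ai.storage_instances.create(name="storage-1")
    vol = si.volumes.create(name="volume-1", size=10, replica_count=3)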

Endpoints

HTTP operations on URL endpoints are the only way to interact with the set of managed objects. URLs have the format:

      http://192.168.42.13:7717/v2.3/<object_class>/[<instance>]/...

where 7717 is the port used to access the API, and "v2.3" is the API version.

Briefly, the REST API supports four operations/methods: create (POST), modify (PUT), list (GET), and delete (DELETE). Any input and return payloads are in JSON format. Login session keys are required in the header of any HTTP request; session keys have a 15-minute lifetime.

For a full reference documentation of the REST API, please review the Datera REST API Guide.

This Python SDK serves as a wrapper around the raw HTTP layer.
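
For illustration, here is roughly how SDK calls translate into REST requests (a conceptual sketch of the object-to-REST translation; consult the Datera REST API Guide for the authoritative endpoint definitions):

    api.app_instances.list()               ->  GET     /v2.3/app_instances
    api.app_instances.create(name="FOO")   ->  POST    /v2.3/app_instances
    ai.set(admin_state="offline")          ->  PUT     /v2.3/app_instances/<instance>
    ai.delete()                            ->  DELETE  /v2.3/app_instances/<instance>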

Using this SDK

The Datera module is named dfs_sdk, and the main entry point is called DateraApi. Obtaining an object handle can be done as follows:

    from dfs_sdk import get_api
    [...]
    api = get_api(mgmt_ip, username, password, "v2.3", **kwargs)

You can also initialize the SDK using a Datera UDC file. The following reads any valid UDC file on the system, or falls back to the current environment variables.

    from dfs_sdk.scaffold import get_api
    [...]
    api = get_api()

Configurable Options

These options can be set at instantiation via the get_api constructor:

    Option        Default   Description
    tenant        '/root'   Datera account tenant/subtenant
    timeout       300 (s)   Timeout for HTTP requests
    secure        True      Whether to use HTTPS (False sets HTTP)
    strict        False     Whether to check if an endpoint is valid before sending the request
    cert          None      HTTPS verification certificate
    cert_key      None      HTTPS verification certificate key
    thread_local  {}        Used for passing values down to the connection layer, usually for logging
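
For example, any of these can be passed to get_api as keyword arguments:

    from dfs_sdk import get_api

    api = get_api("1.1.1.1", "admin", "password", "v2.3",
                  tenant="/root",   # operate under the root tenant
                  timeout=300,      # per-request HTTP timeout, in seconds
                  secure=True,      # use HTTPS rather than HTTP
                  strict=False)     # don't validate endpoints before sending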

Common Objects, Examples and Use Cases

Please see the utils directory for programming examples that cover the following:

Common methods for all objects include create(), set(), delete(), and list(); a combined sketch follows the examples below.

  • To create an app_instance with name FOO:
        ai = api.app_instances.create(name="FOO")
  • Looping through objects can be done via list():
        for ai in api.app_instances.list():
            print "AppInstance: ", ai
  • To set a given app_instance into an offline state:
        ai.set(admin_state="offline")
  • To delete a given app_instance:
        ai.delete()
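
A combined sketch tying these calls together (FOO is a placeholder name):

    from dfs_sdk import get_api

    api = get_api("1.1.1.1", "admin", "password", "v2.3")

    ai = api.app_instances.create(name="FOO")   # create
    for inst in api.app_instances.list():       # list
        print("AppInstance: ", inst)
    ai.set(admin_state="offline")               # modify
    ai.delete()                                 # delete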

Building the PyPI package

Run the following to build the packages (if uploading, ensure the version is incremented in constants.py):

        python setup.py sdist bdist_wheel

Then, to upload the package to PyPI (this step requires valid PyPI credentials):

        twine upload dist/*

You can perform a test upload by running the following (this requires credentials on the test PyPI server):

        twine upload --repository-url https://test.pypi.org/legacy/ dist/*

Reporting Problems

For problems and feedback, please open a GitHub issue. This project is community supported.
