tapipy - Tapis V3 Python SDK

Python library for interacting with an instance of the Tapis API Framework.

The library is automatically generated from the OpenAPI spec files for TACC's Tapis services: at initialization, a Tapis object is built by referencing those spec files. With this functionality a user can authenticate with the Tapis object and obtain a 'live' library for interacting with Tapis services.

Development

This project is under active development, exploring different approaches to SDK generation.

Installation

Tapipy is packaged on PyPI and can be installed with pip.

pip install tapipy

Usage

Tapipy's Tapis object must first be initialized before it can be used. A basic example of logging in with a user account is below.

# Import the Tapis object
from tapipy.tapis import Tapis

# Log into the Tapis service by providing a username/password and the base URL of your tenant. For example, to interact with the tacc tenant:
t = Tapis(base_url='https://tacc.tapis.io',
          username='myuser',
          password='mypass')

# Get tokens that will be used for authenticated function calls
t.get_tokens()

Now you have an authenticated Tapis object that can call Tapis service endpoints. It's useful to know that the Tapis object will automatically refresh its token when needed, so the object should stay in the good graces of Tapis indefinitely.

To use the Tapis object, you can browse all available functions in the Tapis Framework documentation. For example, if I wanted to use the SK service to check whether a user has a specific role, I would find the function on the site (which is just a friendlier way to view the JSON specs).

With the site I can see that I need to take my Tapis object (initialized as t), access sk, and then call the hasRole function with the required inputs, as follows.

t.sk.hasRole(tenant='dev', user='_testuser', roleName='Do you have this role?')

Special Query Parameters and Headers

For the most part, arguments that can or should be passed to a Tapis endpoint are described in the OpenAPI definition files and recognized automatically by tapipy. However, due to limitations in what can be expressed in OpenAPI, there are some parameters that are not defined in the definition files; for example, the search parameters for various endpoints.

To accommodate these cases, tapipy recognizes two special keyword arguments to all of its methods that correspond to Tapis API calls (i.e., all of its "operations"). They are:

  • _tapis_headers -- dictionary-like object of header names (keys) and values.
  • _tapis_query_parameters -- dictionary-like object of query parameter names (keys) and values.

Use the above two special arguments for passing headers (respectively, query parameters) that are not specified in the OpenAPI definition of an endpoint.

For example, I can issue a search using the following syntax:

t.jobs.getJobSearchList(limit=5, orderBy='lastUpdated(desc),name(asc)', _tapis_query_parameters={'key': 'value'})
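The effect of _tapis_query_parameters can be sketched with plain dictionaries. This is a conceptual illustration only, not tapipy's internals, and the endpoint path below is an assumption rather than something taken from the Tapis spec:

```python
from urllib.parse import urlencode

# Conceptual sketch (not tapipy source): parameters passed via
# _tapis_query_parameters are merged with the operation's defined query
# parameters before the request URL is built.
defined = {'limit': 5, 'orderBy': 'lastUpdated(desc),name(asc)'}
extra = {'key': 'value'}  # from _tapis_query_parameters

# The path below is illustrative only.
url = 'https://tacc.tapis.io/v3/jobs/search/jobs?' + urlencode({**defined, **extra})
print(url)
```

Because the extra parameters are merged into the same query string, the endpoint receives them exactly as if they had been declared in the OpenAPI definition.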

Development Docs

Running the tests

Test resources are contained within the test directory; Dockerfile-tests is at the repository root.

  1. Build the test docker image: docker build -t tapis/tapipy-tests -f Dockerfile-tests .
  2. Run these tests using the built docker image: docker run -it --rm -e username=<dev_user> -e password=<dev_pass> tapis/tapipy-tests

Important Parameters to Know

The tapipy package allows for spec file customization in Tapis object initialization:

  • resource_set: str
    • Determines which set of resources to use, master or dev; defaults to master.
    • Important to note that if custom_spec_dict is used, its specs are appended to this resource_set.
      • For example, you could select master and then specify custom specs to be added on top.
  • custom_spec_dict: {resource_name: str, resource_url: str}
    • Allows users to modify the base resource set URLs.
      • e.g. I can specify actor as a resource name and change the URL.
    • Also allows users to add new resources to the set.
      • e.g. I can add a new resource named "test" with a custom URL.
      • Important to know that any new specs will be downloaded and added to the cache.
        • No need to specify download_latest_specs or update spec files.
    • ALLOWS LOCAL RESOURCES!
      • Specify an absolute path in the dict prefixed with local: and tapipy will load a local OpenAPI v3 YAML spec file.
      • custom_spec_dict={'cactus': 'local: /home/tapis/myfolder/cactusSpec.yml'}
  • download_latest_specs: bool
    • Allows users to re-download all specs regardless of whether they already exist in the cache. Defaults to False.
    • This happens every time the Tapis object is initialized; it's a tad slower and can cause live updates to specs.
      • As such, be warned. There are functions to update spec files below.
  • spec_dir: str
    • Allows users to specify the folder to save specs to. Defaults to None, which uses Tapipy's package folder.
    • If you are updating specs it's wise to use a different folder in order to not modify the base specs.
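The local: convention above can be sketched with a small helper. This is a hypothetical illustration of how a local path might be told apart from a URL; the actual tapipy implementation may differ:

```python
# Hypothetical sketch (not tapipy source): distinguish a 'local:'-prefixed
# spec entry from an ordinary URL entry in custom_spec_dict.
def resolve_spec_source(value: str):
    if value.startswith('local:'):
        # Strip the prefix and surrounding whitespace to get the file path.
        return ('local', value[len('local:'):].strip())
    return ('url', value)

print(resolve_spec_source('local: /home/tapis/myfolder/cactusSpec.yml'))
# → ('local', '/home/tapis/myfolder/cactusSpec.yml')
print(resolve_spec_source('https://example.com/openapi_v3.yml'))
# → ('url', 'https://example.com/openapi_v3.yml')
```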

The following is an example of custom parameter settings. The abaco resource will now use the spec at URL#1, overwriting the resource definition in the master resource set; it will be downloaded if it does not already exist in the cache. The same goes for the longhorn resource. This means the Tapis object will have access to all specs in master as normal, but with a modified abaco and a new longhorn resource. All of these are stored in the new spec_dir so that no base specs are accidentally overwritten by a later call to update_spec_cache() (discussed in the next section).

from tapipy.tapis import Tapis

t = Tapis(base_url='https://admin.develop.tapis.io',
          tenant_id='admin',
          username='username',
          account_type='user',
          password='password',
          resource_set='master',
          custom_spec_dict={'abaco': 'URL#1',
                            'longhorn': 'URL#2',
                            'cactus': 'local: /home/tapis/myfolder/cactusSpec.yml'},
          spec_dir='/home/username/tapipy_specs')
t.get_tokens()

Update Specs Files

The Tapipy package now uses a cache to organize spec dictionaries as pickled files and has the ability to accept custom spec files. By default Tapipy keeps a set of base spec files in the %tapipy%/specs folder. These specs are pre-pickled at package creation time.
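The pickled-cache idea can be sketched in a few lines. The file name and spec contents here are made up, and tapipy's real cache layout may differ:

```python
import os
import pickle
import tempfile

# Illustrative only: the spec cache stores each OpenAPI spec as a pickled
# Python dict, so later loads skip re-downloading and re-parsing the YAML.
spec = {'openapi': '3.0.2', 'info': {'title': 'actors'}}

path = os.path.join(tempfile.gettempdir(), 'actors.pickle')
with open(path, 'wb') as f:
    pickle.dump(spec, f)

with open(path, 'rb') as f:
    loaded = pickle.load(f)

print(loaded == spec)  # the round-tripped dict is identical
```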

In order to update all default spec files a user can use the update_spec_cache() function. Said function's definition is below. If no resources are provided, the function downloads all default spec URLs from the RESOURCES object in the %tapipy%/tapipy/tapis.py file.

Resources = Dict[ResourceName, ResourceUrl]
update_spec_cache(resources: Resources = None, spec_dir: str = None)

Users are able to specify custom resources to download by providing their own resource dictionary. For example, providing {'actors': 'URLToMyActorDictionary'} would update that spec.

Users can also specify where to save the updated specs with the spec_dir variable.
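The selection logic described above can be sketched as follows. The RESOURCES contents and URLs are placeholders, not tapipy's actual defaults:

```python
# Illustrative sketch (not tapipy source): with no resources argument, every
# default spec is refreshed; otherwise only the given resources are.
DEFAULT_RESOURCES = {
    'actors': 'https://example.com/actors.yml',  # placeholder URL
    'jobs': 'https://example.com/jobs.yml',      # placeholder URL
}

def specs_to_update(resources=None):
    # None means "refresh everything"; a dict means "refresh just these".
    return dict(DEFAULT_RESOURCES if resources is None else resources)

print(specs_to_update())                       # all defaults
print(specs_to_update({'actors': 'my-url'}))   # just the override
```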

The Tapis object itself also has an update_spec_cache() function that takes the parameters given at startup and updates the spec cache. This means that if the Tapis object was given a custom spec dictionary, its update_spec_cache() will update those specs without the need to set any parameters.

t.update_spec_cache()

Build instructions

Building is done with poetry as follows:

pip install poetry
poetry install

This installs tapipy to a virtual environment. Run a shell in this environment with:

poetry shell

To install locally (not in a virtual environment):

pip install poetry
poetry build
cd dist
pip install *.whl

PyPI Push Instructions

poetry build
poetry publish

Archive Usage

TODO - provide working examples, e.g.,

import tapipy
t = tapipy.Tapis(base_url='http://localhost:5001')
req = t.tokens.NewTokenRequest(token_type='service', token_tenant_id='dev', token_username='admin')
t.tokens.create_token(req)

import openapi_client
configuration = openapi_client.Configuration()
configuration.host = 'http://localhost:5001'
api_instance = openapi_client.TokensApi(openapi_client.ApiClient(configuration))

new_token = openapi_client.NewTokenRequest(token_type='service', token_tenant_id='dev', token_username='admin')

resp = api_instance.create_token(new_token)
jwt = resp.get('result').get('access_token').get('access_token')

Project details



Download files

Download the file for your platform.

Source Distribution

tapipy-1.2.11.tar.gz (171.2 kB)

Uploaded Source

Built Distribution

tapipy-1.2.11-py3-none-any.whl (185.4 kB)

Uploaded Python 3

File details

Details for the file tapipy-1.2.11.tar.gz.

File metadata

  • Download URL: tapipy-1.2.11.tar.gz
  • Upload date:
  • Size: 171.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.7 CPython/3.8.10 Linux/5.14.0-1051-oem

File hashes

Hashes for tapipy-1.2.11.tar.gz

  • SHA256: c44ca0df4225004cc052ab2ddd642ed03146702c97fd55082f06d1f2580ce566
  • MD5: caf746898827c365fce63605355b897a
  • BLAKE2b-256: edcf1505f0db58b4ea7a3aed22762a7f567c08fe99efd30f1f16f794711704a7


File details

Details for the file tapipy-1.2.11-py3-none-any.whl.

File metadata

  • Download URL: tapipy-1.2.11-py3-none-any.whl
  • Upload date:
  • Size: 185.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.7 CPython/3.8.10 Linux/5.14.0-1051-oem

File hashes

Hashes for tapipy-1.2.11-py3-none-any.whl

  • SHA256: f95c403246bd8b5d5159882310f28c529ca4c590d3a3def2af5b0ab215887e09
  • MD5: bcfc0b6220f16bfac1bdee6a38a33943
  • BLAKE2b-256: 5518fcf6231bcac45023afa1e232fab5e585a39d1fbdf42af1c1cd6eb3ecb5fb

