
Inferrd

Inferrd is a hosting platform for TensorFlow models.

Authentication

To use this library, you need to get an API token from inferrd.com.

Authenticate with the inferrd.auth method:

import inferrd

inferrd.auth('<token>')

Deploying TensorFlow

First, create a model on inferrd.com and select the kind of instance you want. Then simply call inferrd.deploy_tf:

import inferrd

# this only needs to be done once
inferrd.auth('<token>')

# deploy TF
inferrd.deploy_tf(tf_model, '<name of the model>')
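The example above assumes tf_model is an existing TensorFlow model. As a point of reference, here is a minimal sketch of building a small tf.keras model to deploy; the architecture is purely illustrative, and it assumes deploy_tf accepts a compiled Keras model:

import tensorflow as tf
import inferrd

# this only needs to be done once
inferrd.auth('<token>')

# a tiny illustrative Keras model (replace with your own)
tf_model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
tf_model.compile(optimizer='adam', loss='mse')

# deploy under the model name created on inferrd.com
inferrd.deploy_tf(tf_model, '<name of the model>')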

Fetching predictions

Inferrd lets you pull prediction requests back into your notebook with inferrd.get_requests:

import inferrd

# this only needs to be done once
inferrd.auth('<token>')

# get the requests
requests = inferrd.get_requests('<name of the model>', limit=100, page=0, includeFailures=False)
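The shape of the returned data isn't documented here; assuming get_requests returns an empty list once a page has no more entries, a sketch for paging through all requests with the parameters shown above could look like this:

import inferrd

# this only needs to be done once
inferrd.auth('<token>')

# page through requests 100 at a time (sketch; assumes an empty list
# is returned when a page has no data)
page = 0
all_requests = []
while True:
    batch = inferrd.get_requests('<name of the model>', limit=100, page=page, includeFailures=False)
    if not batch:
        break
    all_requests.extend(batch)
    page += 1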
