
Python library for easily interacting with trained machine learning models


gradio_client: Use any Gradio app as an API -- in 3 lines of Python

This directory contains the source code for gradio_client, a lightweight Python library that makes it very easy to use any Gradio app as an API. Warning: This library is currently in alpha, and APIs may change.

As an example, consider the Stable Diffusion Gradio app, which is hosted on Hugging Face Spaces and generates images from a text prompt. Using the gradio_client library, we can easily use this Gradio app as an API to generate images programmatically.

Here's the entire code to do it:

import gradio_client as grc

client = grc.Client("stabilityai/stable-diffusion")
# The second argument and fn_index=1 correspond to this particular app's API signature
job = client.predict("a hyperrealistic portrait of a cat wearing cyberpunk armor", "", fn_index=1)
job.result()

>> /Users/usersname/b8c26657-df87-4508-aa75-eb37cd38735f  # Path to generated gallery of images

Installation

If you already have a recent version of gradio, then gradio_client is already included as a dependency.

Otherwise, the lightweight gradio_client package can be installed with pip (or pip3) and works with Python 3.9 or higher:

$ pip install gradio_client
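
Either way, you can quickly sanity-check that the client is importable from your environment (this check is optional and just confirms the install):

$ python -c "import gradio_client"

If the command exits without an ImportError, the package is available.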

Usage

Connecting to a Space or a Gradio app

Start by instantiating a Client object and connecting it to a Gradio app that is running on Spaces (or anywhere else)!

Connecting to a Space

import gradio_client as grc

client = grc.Client("abidlabs/en2fr")

Connecting to a general Gradio app

If your app is running somewhere else, provide the full URL to the src argument instead. Here's an example of making predictions against a Gradio app that is running at a share URL:

import gradio_client as grc

client = grc.Client(src="btd372-js72hd.gradio.app")
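
The same pattern should work for an app running on your own machine. The following is only a sketch: the localhost address is a placeholder, and you should substitute the URL your Gradio app prints when it launches.

import gradio_client as grc

# Placeholder local address -- replace with the URL your running Gradio app reports
client = grc.Client(src="http://localhost:7860/")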

Making a prediction

The simplest way to make a prediction is to call the .predict() function with the appropriate arguments and then immediately call .result(), like this:

import gradio_client as grc

client = grc.Client(space="abidlabs/en2fr")

client.predict("Hello").result()

>> Bonjour

Running jobs asynchronously

One should note that .result() is a blocking operation, as it waits for the job to complete before returning the prediction.

In many cases, you may be better off letting the job run asynchronously and waiting to call .result() when you need the results of the prediction. For example:

import gradio_client as grc

client = grc.Client(space="abidlabs/en2fr")

job = client.predict("Hello")

# Do something else

job.result()

>> Bonjour

Adding callbacks

Alternatively, one can add callbacks to perform actions after the job has completed running, like this:

import gradio_client as grc


def print_result(x):
    print("The translated result is: {x}")

client = grc.Client(space="abidlabs/en2fr")

job = client.predict("Hello", callbacks=[print_result])
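
The job returned here is the same kind of job as before, so (as a sketch reusing only the calls shown above) you can still block on it in the main thread while the callback runs on completion:

# Block until the prediction finishes; print_result is also invoked with the output
translation = job.result()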

