Python library for easily interacting with trained machine learning models

Project description

modelly_client: Use a Modelly app as an API -- in 3 lines of Python

This directory contains the source code for modelly_client, a lightweight Python library that makes it very easy to use any Modelly app as an API.

As an example, consider this Hugging Face Space that transcribes audio files that are recorded from the microphone.

Using the modelly_client library, we can easily use the Modelly app as an API to transcribe audio files programmatically.

Here's the entire code to do it:

from modelly_client import Client

client = Client("abidlabs/whisper")
client.predict("audio_sample.wav")

>> "This is a test of the whisper speech recognition model."

The Modelly client works with any Modelly Space, whether it be an image generator, a stateful chatbot, or a tax calculator.

Installation

If you already have a recent version of modelly, then the modelly_client is included as a dependency.

Otherwise, the lightweight modelly_client package can be installed via pip (or pip3) and works with Python 3.10 or higher:

$ pip install modelly_client

Basic Usage

Connecting to a Space or a Modelly app

Start by instantiating a Client object and connecting it to a Modelly app that is running on Spaces (or anywhere else)!

Connecting to a Space

from modelly_client import Client

client = Client("abidlabs/en2fr")  # a Space that translates from English to French

You can also connect to private Spaces by passing in your HF token with the hf_token parameter. You can get your HF token here: https://huggingface.co/settings/tokens

from modelly_client import Client

client = Client("abidlabs/my-private-space", hf_token="...")

Duplicating a Space for private use

While you can use any public Space as an API, you may get rate limited by Hugging Face if you make too many requests. For unlimited usage of a Space, simply duplicate the Space to create a private Space, and then use it to make as many requests as you'd like!

The modelly_client library includes a class method, Client.duplicate(), to make this process simple:

from modelly_client import Client

client = Client.duplicate("abidlabs/whisper")
client.predict("audio_sample.wav")

>> "This is a test of the whisper speech recognition model."

If you have previously duplicated a Space, re-running duplicate() will not create a new Space. Instead, the Client will attach to the previously-created Space. So it is safe to re-run the Client.duplicate() method multiple times.

Note: if the original Space uses GPUs, your private Space will as well, and your Hugging Face account will get billed based on the price of the GPU. To minimize charges, your Space will automatically go to sleep after 1 hour of inactivity. You can also set the hardware using the hardware parameter of duplicate().
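As a sketch, duplicating onto specific hardware might look like the following. The "t4-medium" tier name here is an assumption for illustration; check which hardware tiers are available to your Hugging Face account.

```python
from modelly_client import Client

# Duplicate the Space onto GPU hardware. "t4-medium" is an assumed
# tier name -- substitute whichever tier your account offers.
client = Client.duplicate(
    "abidlabs/whisper",
    hardware="t4-medium",
)
```

Because re-running duplicate() attaches to an existing duplicate rather than creating a new one, this call is safe to leave in a script that runs repeatedly.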

Connecting to a general Modelly app

If your app is running somewhere else, just provide the full URL instead, including the "http://" or "https://". Here's an example of making predictions with a Modelly app that is running on a share URL:

from modelly_client import Client

client = Client("https://bec81a83-5b5c-471e.modelly.live")

Inspecting the API endpoints

Once you have connected to a Modelly app, you can view the APIs that are available to you by calling the .view_api() method. For the Whisper Space, we see the following:

Client.predict() Usage Info
---------------------------
Named API endpoints: 1

 - predict(input_audio, api_name="/predict") -> value_0
    Parameters:
     - [Audio] input_audio: str (filepath or URL)
    Returns:
     - [Textbox] value_0: str (value)

This shows that the Space exposes 1 API endpoint and how to use it to make a prediction: we should call the .predict() method, providing a parameter input_audio of type str, which is a filepath or URL.

We should also provide the api_name='/predict' argument. Although this isn't necessary if a Modelly app has a single named endpoint, it does allow us to call different endpoints in a single app if they are available. If an app has unnamed API endpoints, these can also be displayed by running .view_api(all_endpoints=True).
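Putting this together, a call that names the endpoint explicitly might look like the following (the Space and filename match the earlier Whisper example):

```python
from modelly_client import Client

client = Client("abidlabs/whisper")

# Explicitly target the "/predict" endpoint. With a single named
# endpoint this is optional, but it disambiguates apps that expose
# several endpoints.
client.predict("audio_sample.wav", api_name="/predict")

# Also list any unnamed endpoints the app exposes.
client.view_api(all_endpoints=True)
```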

Making a prediction

The simplest way to make a prediction is to call the .predict() method with the appropriate arguments:

from modelly_client import Client

client = Client("abidlabs/en2fr")
client.predict("Hello")

>> Bonjour

If there are multiple parameters, then you should pass them as separate arguments to .predict(), like this:

from modelly_client import Client

client = Client("modelly/calculator")
client.predict(4, "add", 5)

>> 9.0

For certain inputs, such as images, you should pass in the filepath or URL to the file. Likewise, for the corresponding output types, you will get a filepath or URL returned.

from modelly_client import Client

client = Client("abidlabs/whisper")
client.predict("https://audio-samples.github.io/samples/mp3/blizzard_unconditional/sample-0.mp3")

>> "My thought I have nobody by a beauty and will as you poured. Mr. Rochester is serve in that so don't find simpus, and devoted abode, to at might in a r—"

Advanced Usage

For more ways to use the Modelly Python Client, check out our dedicated Guide on the Python client, available here: https://www.modelly.khulnasoft.com/guides/getting-started-with-the-python-client
