A Python SDK for Hume AI

Hume AI Python SDK

Integrate Hume APIs directly into your Python application



Migration Guide for Version 0.7.0 and Above

There were major breaking changes in version 0.7.0 of the SDK. If you are upgrading from an earlier version, please view the Migration Guide. That release deprecated several interfaces and moved them to the hume[legacy] package extra. The legacy extra was removed in 0.9.0; the last version to include it was 0.8.6.

Documentation

API reference documentation is available here.

Compatibility

The Hume Python SDK is compatible across several Python versions and operating systems.

The table below shows Python version and operating system compatibility by product:

Product                        Python Versions                Operating Systems
Empathic Voice Interface       3.9, 3.10, 3.11, 3.12, 3.13    macOS, Linux
Text-to-speech (TTS)           3.9, 3.10, 3.11, 3.12, 3.13    macOS, Linux, Windows
Expression Measurement         3.9, 3.10, 3.11, 3.12, 3.13    macOS, Linux, Windows

Installation

pip install hume
# or
poetry add hume
# or
uv add hume

Usage

from hume.client import HumeClient

client = HumeClient(api_key="YOUR_API_KEY")
client.empathic_voice.configs.list_configs()

Async Client

The SDK also exports an async client so that you can make non-blocking calls to our API.

import asyncio

from hume.client import AsyncHumeClient

client = AsyncHumeClient(api_key="YOUR_API_KEY")

async def main() -> None:
    await client.empathic_voice.configs.list_configs()

asyncio.run(main())
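
Because the async client is non-blocking, you can also issue several requests concurrently. Here is a minimal sketch using asyncio.gather with the same list methods shown elsewhere in this README:

import asyncio

from hume.client import AsyncHumeClient

client = AsyncHumeClient(api_key="YOUR_API_KEY")

async def main() -> None:
    # Both requests are issued concurrently rather than one after the other.
    configs, tools = await asyncio.gather(
        client.empathic_voice.configs.list_configs(),
        client.empathic_voice.tools.list_tools(),
    )

asyncio.run(main())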

Writing Files

Writing files from an async stream of bytes can be tricky in Python. The aiofiles package simplifies this. For example, you can download your job artifacts like so:

import asyncio

import aiofiles

from hume import AsyncHumeClient

client = AsyncHumeClient()

async def main() -> None:
    # Stream the job artifacts to disk chunk by chunk.
    async with aiofiles.open('artifacts.zip', mode='wb') as file:
        async for chunk in client.expression_measurement.batch.get_job_artifacts(id="my-job-id"):
            await file.write(chunk)

asyncio.run(main())
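
If you are working with the synchronous client instead, an ordinary file handle is enough. This is a sketch that assumes the sync client exposes the same get_job_artifacts method as an iterable of byte chunks:

from hume.client import HumeClient

client = HumeClient(api_key="YOUR_API_KEY")

# Assumption: the sync client yields the artifact archive as byte chunks.
with open('artifacts.zip', mode='wb') as file:
    for chunk in client.expression_measurement.batch.get_job_artifacts(id="my-job-id"):
        file.write(chunk)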

Namespaces

This SDK contains the APIs for Empathic Voice, Text-to-speech (TTS), and Expression Measurement. Even if you only plan to use one API to start, the SDK provides easy access to the others in case you would like to use them in the future.

Each API is namespaced accordingly:

from hume.client import HumeClient

client = HumeClient(api_key="YOUR_API_KEY")

client.empathic_voice.         # APIs specific to Empathic Voice
client.tts.                    # APIs specific to Text-to-speech
client.expression_measurement. # APIs specific to Expression Measurement

Exception Handling

All errors thrown by the SDK will be subclasses of ApiError.

import hume.core

from hume.client import HumeClient

client = HumeClient(api_key="YOUR_API_KEY")

try:
    client.expression_measurement.batch.get_job_predictions(...)
except hume.core.ApiError as e:  # Handle all errors
    print(e.status_code)
    print(e.body)
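
Because ApiError exposes the response status code, you can also branch on it. A minimal sketch, continuing from the client above (the 429 handling shown is purely illustrative):

import hume.core

try:
    client.expression_measurement.batch.get_job_predictions(...)
except hume.core.ApiError as e:
    if e.status_code == 429:
        # Rate limited: back off before retrying, or rely on the SDK's built-in retries (see below).
        ...
    else:
        raise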

Pagination

Paginated requests return a SyncPager or AsyncPager, which can be iterated like a generator over the underlying objects. For example, list_tools returns a pager over ReturnUserDefinedTool and handles the pagination behind the scenes:

from hume.client import HumeClient

client = HumeClient(api_key="YOUR_API_KEY")

for tool in client.empathic_voice.tools.list_tools():
    print(tool)

You can also iterate page by page:

for page in client.empathic_voice.tools.list_tools().iter_pages():
    print(page.items)

Or step through pages manually:

pager = client.empathic_voice.tools.list_tools()
# First page
print(pager.items)
# Second page
pager = pager.next_page()
print(pager.items)
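
The async client paginates the same way through AsyncPager. A sketch, assuming the awaited call returns an AsyncPager that supports async for:

import asyncio

from hume.client import AsyncHumeClient

client = AsyncHumeClient(api_key="YOUR_API_KEY")

async def main() -> None:
    # Assumption: awaiting list_tools on the async client yields an AsyncPager.
    pager = await client.empathic_voice.tools.list_tools()
    async for tool in pager:
        print(tool)

asyncio.run(main())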

WebSockets

We expose a websocket client for interacting with the EVI API as well as Expression Measurement.

When interacting with these clients, you can use them very similarly to how you'd use the common websockets library:

import asyncio
import os

from hume import AsyncHumeClient, StreamDataModels

client = AsyncHumeClient(api_key=os.getenv("HUME_API_KEY"))

async def main() -> None:
    async with client.expression_measurement.stream.connect(
        options={"config": StreamDataModels(...)}
    ) as hume_socket:
        print(await hume_socket.get_job_details())

asyncio.run(main())

The underlying connection, in this case hume_socket, will support intellisense/autocomplete for the different functions that are available on the socket!

Advanced

Retries

The Hume SDK is instrumented with automatic retries with exponential backoff. A request will be retried as long as it is deemed retriable and the number of retry attempts does not exceed the configured retry limit.

A request is deemed retriable when any of the following HTTP status codes is returned:

  • 408 (Timeout)
  • 409 (Conflict)
  • 429 (Too Many Requests)
  • 5XX (Internal Server Errors)

Use the max_retries request option to configure this behavior.

from hume.client import HumeClient
from hume.core import RequestOptions

client = HumeClient(...)

# Override retries for a specific method
client.expression_measurement.batch.get_job_predictions(...,
    request_options=RequestOptions(max_retries=5)
)
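
For intuition, the retry behavior described above follows the usual retry-with-exponential-backoff pattern. The sketch below is purely illustrative and is not the SDK's internal implementation; the base delay and jitter values are made up:

import random
import time

RETRIABLE_STATUS_CODES = {408, 409, 429} | set(range(500, 600))

def backoff_delay(attempt: int, base: float = 0.5) -> float:
    # Delay doubles with each attempt, plus a little jitter to avoid thundering herds.
    return base * (2 ** attempt) + random.uniform(0, 0.1)

def call_with_retries(send_request, max_retries: int = 5):
    for attempt in range(max_retries + 1):
        status_code, response = send_request()
        if status_code not in RETRIABLE_STATUS_CODES or attempt == max_retries:
            return response
        time.sleep(backoff_delay(attempt))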

Timeouts

By default, requests time out after 60 seconds. You can configure this with a timeout option at the client or request level.

from hume.client import HumeClient
from hume.core import RequestOptions

client = HumeClient(
    # All timeouts are 20 seconds
    timeout=20.0,
)

# Override timeout for a specific method
client.expression_measurement.batch.get_job_predictions(...,
    request_options=RequestOptions(timeout_in_seconds=20)
)

Custom HTTP client

You can override the httpx client to customize it for your use-case. Some common use-cases include support for proxies and transports.

import httpx

from hume.client import HumeClient

client = HumeClient(
    http_client=httpx.Client(
        proxies="http://my.test.proxy.example.com",
        transport=httpx.HTTPTransport(local_address="0.0.0.0"),
    ),
)
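
If you use the async client, the same idea should carry over with an httpx.AsyncClient. This is a sketch that assumes AsyncHumeClient accepts the same http_client argument; check the constructor signature if it differs:

import httpx

from hume.client import AsyncHumeClient

client = AsyncHumeClient(
    # Assumption: the async client takes an httpx.AsyncClient via the same http_client parameter.
    http_client=httpx.AsyncClient(
        proxies="http://my.test.proxy.example.com",
        transport=httpx.AsyncHTTPTransport(local_address="0.0.0.0"),
    ),
)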

Contributing

While we value open-source contributions to this SDK, this library is generated programmatically.

Additions made directly to this library would have to be moved over to our generation code, otherwise they would be overwritten upon the next generated release. Feel free to open a PR as a proof of concept, but know that we will not be able to merge it as-is. We suggest opening an issue first to discuss with us!

On the other hand, contributions to the README are always very welcome!


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hume-0.13.8.tar.gz (142.3 kB)

Built Distribution

hume-0.13.8-py3-none-any.whl (353.0 kB)

File details

Details for the file hume-0.13.8.tar.gz.

File metadata

  • Download URL: hume-0.13.8.tar.gz
  • Size: 142.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.0

File hashes

Hashes for hume-0.13.8.tar.gz:

  • SHA256: 067691b0ce0353e4438d32d5fbfcbb6ed2099533bf5e06af99084c8c76fad24f
  • MD5: 095d49c853bfb60a549588c0f7caabf5
  • BLAKE2b-256: 935b849ac072161e985ce5758f19f792043274b64a9f9dd73fdd14333b7446f4


File details

Details for the file hume-0.13.8-py3-none-any.whl.

File metadata

  • Download URL: hume-0.13.8-py3-none-any.whl
  • Size: 353.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.14.0

File hashes

Hashes for hume-0.13.8-py3-none-any.whl:

  • SHA256: 8295c095e4e04918512eec2df3adf4a0900b8d7ef06e3e8487c45ab520ed0ad5
  • MD5: 261dbea7a477992695fa81d6862e3a42
  • BLAKE2b-256: 4810ec2c1e9a0401a39c3575ff8c5e42ad4b03687d5dbdefaa94ec5d52dbe088

