
The Humanloop API allows you to interact with Humanloop from your product or service. You can do this through HTTP requests from any language or via our official Python SDK.

To install the official Python SDK, run the following command:

pip install humanloop

Guides and further details about key concepts can be found in our docs: https://humanloop.gitbook.io/humanloop-docs/
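
Because the API is plain HTTP, you can also call it without the SDK. The sketch below uses the requests library; the /chat path and the X-API-KEY header name are assumptions based on the SDK examples further down, so check the API reference for the exact endpoint and header.

import requests

# Hedged sketch of a raw HTTP call to the chat endpoint.
# Assumptions: the /chat path and the X-API-KEY header name follow the SDK
# defaults shown below; consult the API reference for the exact details.
response = requests.post(
    "https://api.humanloop.com/v4/chat",
    headers={"X-API-KEY": "YOUR_API_KEY"},
    json={
        "project": "sdk-example",
        "messages": [{"role": "user", "content": "Explain asynchronous programming."}],
        "model_config": {"model": "gpt-3.5-turbo", "max_tokens": -1, "temperature": 0.7},
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())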

Project description

humanloop@0.4.0a4

Requirements

Python >=3.7

Installing

pip install humanloop==0.4.0a4

Getting Started

from pprint import pprint
from humanloop import Humanloop, ApiException

humanloop = Humanloop(
    # Defining the host is optional and defaults to https://api.humanloop.com/v4
    # See configuration.py for a list of all supported configuration parameters.
    host="https://api.humanloop.com/v4",
    api_key="YOUR_API_KEY",
)

try:
    # Chat
    chat_response = humanloop.chat(
        project="sdk-example",
        messages=[
            {
                "role": "user",
                "content": "Explain asynchronous programming.",
            }
        ],
        model_config={
            "model": "gpt-3.5-turbo",
            "max_tokens": -1,
            "temperature": 0.7,
            "chat_template": [
                {
                    "role": "system",
                    "content": "You are a helpful assistant who replies in the style of {{persona}}.",
                },
            ],
        },
        inputs={
            "persona": "the pirate Blackbeard",
        },
        stream=False,
    )
    pprint(chat_response.body)
    pprint(chat_response.body["project_id"])
    pprint(chat_response.body["data"][0])
    pprint(chat_response.body["provider_responses"])
    pprint(chat_response.headers)
    pprint(chat_response.status)
    pprint(chat_response.round_trip_time)
except ApiException as e:
    print("Exception when calling .response: %s\n" % e)
    pprint(e.body)
    if e.status == 422:
        pprint(e.body["detail"])
    pprint(e.headers)
    pprint(e.status)
    pprint(e.reason)
    pprint(e.round_trip_time)

try:
    # Complete
    complete_response = humanloop.complete(
        project="sdk-example",
        inputs={
            "text": "Llamas that are well-socialized and trained to halter and lead after weaning and are very friendly and pleasant to be around. They are extremely curious and most will approach people easily. However, llamas that are bottle-fed or over-socialized and over-handled as youth will become extremely difficult to handle when mature, when they will begin to treat humans as they treat each other, which is characterized by bouts of spitting, kicking and neck wrestling.[33]",
        },
        model_config={
            "model": "gpt-3.5-turbo",
            "max_tokens": -1,
            "temperature": 0.7,
            "prompt_template": "Summarize this for a second-grade student:\n\nText:\n{{text}}\n\nSummary:\n",
        },
        stream=False,
    )
    pprint(complete_response.body)
    pprint(complete_response.body["project_id"])
    pprint(complete_response.body["data"][0])
    pprint(complete_response.body["provider_responses"])
    pprint(complete_response.headers)
    pprint(complete_response.status)
    pprint(complete_response.round_trip_time)
except ApiException as e:
    print("Exception when calling .create: %s\n" % e)
    pprint(e.body)
    if e.status == 422:
        pprint(e.body["detail"])
    pprint(e.headers)
    pprint(e.status)
    pprint(e.reason)
    pprint(e.round_trip_time)

try:
    # Feedback
    feedback_response = humanloop.feedback(
        type="rating",
        value="good",
        data_id="data_[...]",
        user="user@example.com",
    )
    pprint(feedback_response.body)
    pprint(feedback_response.headers)
    pprint(feedback_response.status)
    pprint(feedback_response.round_trip_time)
except ApiException as e:
    print("Exception when calling .submit: %s\n" % e)
    pprint(e.body)
    if e.status == 422:
        pprint(e.body["detail"])
    pprint(e.headers)
    pprint(e.status)
    pprint(e.reason)
    pprint(e.round_trip_time)

try:
    # Log
    log_response = humanloop.log(
        project="sdk-example",
        inputs={
            "text": "Llamas that are well-socialized and trained to halter and lead after weaning and are very friendly and pleasant to be around. They are extremely curious and most will approach people easily. However, llamas that are bottle-fed or over-socialized and over-handled as youth will become extremely difficult to handle when mature, when they will begin to treat humans as they treat each other, which is characterized by bouts of spitting, kicking and neck wrestling.[33]",
        },
        output="Llamas can be friendly and curious if they are trained to be around people, but if they are treated too much like pets when they are young, they can become difficult to handle when they grow up. This means they might spit, kick, and wrestle with their necks.",
        source="sdk",
        model_config={
            "model": "gpt-3.5-turbo",
            "max_tokens": -1,
            "temperature": 0.7,
            "prompt_template": "Summarize this for a second-grade student:\n\nText:\n{{text}}\n\nSummary:\n",
        },
    )
    pprint(log_response.body)
    pprint(log_response.headers)
    pprint(log_response.status)
    pprint(log_response.round_trip_time)
except ApiException as e:
    print("Exception when calling .log: %s\n" % e)
    pprint(e.body)
    if e.status == 422:
        pprint(e.body["detail"])
    pprint(e.headers)
    pprint(e.status)
    pprint(e.reason)
    pprint(e.round_trip_time)
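
In the feedback example above, the data_id typically comes from an earlier chat, complete, or log call. The sketch below continues with the humanloop client constructed above and assumes each item in the response body's "data" list carries an "id" field; inspect the response body to confirm the exact field name.

# Hedged sketch: attach feedback to a datapoint returned by an earlier call.
# Assumption: the logged datapoint id is exposed as body["data"][0]["id"].
chat_response = humanloop.chat(
    project="sdk-example",
    messages=[{"role": "user", "content": "Explain asynchronous programming."}],
    model_config={"model": "gpt-3.5-turbo", "max_tokens": -1, "temperature": 0.7},
    stream=False,
)
data_id = chat_response.body["data"][0]["id"]

humanloop.feedback(
    type="rating",
    value="good",
    data_id=data_id,
    user="user@example.com",
)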

Async

Async support is available by prepending "a" to any method name (for example, aget instead of get).

import asyncio
from pprint import pprint
from humanloop import Humanloop, ApiException

humanloop = Humanloop(
    # Defining the host is optional and defaults to https://api.humanloop.com/v4
    # See configuration.py for a list of all supported configuration parameters.
    host="https://api.humanloop.com/v4",
    # Configure API key authorization: APIKeyHeader
    api_key="YOUR_API_KEY",
    # Uncomment below to setup prefix (e.g. Bearer) for API key, if needed
    # api_key_prefix = {'APIKeyHeader': 'Bearer'},
)


async def main():
    try:
        # Get App
        get_response = await humanloop.apps.aget(
            id="id_example",  # required
        )
        pprint(get_response.headers)
        pprint(get_response.status)
        pprint(get_response.round_trip_time)
    except ApiException as e:
        print("Exception when calling AppsApi.get: %s\n" % e)
        pprint(e.body)
        if e.status == 422:
            pprint(e.body["detail"])
        pprint(e.headers)
        pprint(e.status)
        pprint(e.reason)
        pprint(e.round_trip_time)


asyncio.run(main())
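
The same convention applies to the chat and complete methods from the Getting Started example. The sketch below assumes the async counterpart of chat is named achat, following the prefixing rule above; verify the exact name in the generated SDK.

import asyncio
from humanloop import Humanloop

humanloop = Humanloop(api_key="YOUR_API_KEY")


async def main():
    # Assumption: the async variant of chat() is achat(), per the
    # "prepend a" convention described above.
    chat_response = await humanloop.achat(
        project="sdk-example",
        messages=[{"role": "user", "content": "Explain asynchronous programming."}],
        model_config={"model": "gpt-3.5-turbo", "max_tokens": -1, "temperature": 0.7},
        stream=False,
    )
    print(chat_response.body["data"][0])


asyncio.run(main())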

Streaming

Streaming support is available by suffixing a chat or complete method with _stream.

import asyncio
from humanloop import Humanloop

humanloop = Humanloop(
    api_key="YOUR_API_KEY",
)


async def main():
    response = await humanloop.chat_stream(
        project="sdk-example",
        messages=[
            {
                "role": "user",
                "content": "Explain asynchronous programming.",
            }
        ],
        model_config={
            "model": "gpt-3.5-turbo",
            "max_tokens": -1,
            "temperature": 0.7,
            "chat_template": [
                {
                    "role": "system",
                    "content": "You are a helpful assistant who replies in the style of {{persona}}.",
                },
            ],
        },
        inputs={
            "persona": "the pirate Blackbeard",
        },
    )
    async for token in response.content:
        print(token)


asyncio.run(main())
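
The complete flavour follows the same pattern. The sketch below assumes complete_stream mirrors chat_stream and yields tokens on response.content; verify the exact signature in the SDK.

import asyncio
from humanloop import Humanloop

humanloop = Humanloop(
    api_key="YOUR_API_KEY",
)


async def main():
    # Assumption: complete_stream mirrors chat_stream, per the _stream
    # suffix convention described above.
    response = await humanloop.complete_stream(
        project="sdk-example",
        inputs={"text": "Llamas are curious and most will approach people easily."},
        model_config={
            "model": "gpt-3.5-turbo",
            "max_tokens": -1,
            "temperature": 0.7,
            "prompt_template": "Summarize this for a second-grade student:\n\nText:\n{{text}}\n\nSummary:\n",
        },
    )
    async for token in response.content:
        print(token)


asyncio.run(main())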

Author

This Python package is automatically generated by Konfig.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

humanloop-0.4.0a4.tar.gz (135.6 kB, source)

Built Distribution

humanloop-0.4.0a4-py3-none-any.whl (567.0 kB, Python 3 wheel)

File details

Details for the file humanloop-0.4.0a4.tar.gz.

File metadata

  • Download URL: humanloop-0.4.0a4.tar.gz
  • Upload date:
  • Size: 135.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.16

File hashes

Hashes for humanloop-0.4.0a4.tar.gz
Algorithm Hash digest
SHA256 fdb71bdfb1b2b6c1c64cdb2f9bfec1fcedddc8740367452214ac733211b24afa
MD5 32d918ce67eee0368dd5bb8870999da7
BLAKE2b-256 755b6df87138ed024f46849c1e0ae0c6e40336ec23923d914f4bf76431ac062f


File details

Details for the file humanloop-0.4.0a4-py3-none-any.whl.

File metadata

  • Download URL: humanloop-0.4.0a4-py3-none-any.whl
  • Upload date:
  • Size: 567.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.16

File hashes

Hashes for humanloop-0.4.0a4-py3-none-any.whl
Algorithm Hash digest
SHA256 6e39f3b4be851c66c4a684df06eb1d85174b66adce7de1027c5cd21fba9f465c
MD5 c3d9f2deafe6e9435814229574f41a65
BLAKE2b-256 bb7be34001f68346e1fb6ecf1644a1ba94352e5dd081dbce83634537815cd158

