
Client library for Enterprise h2oGPTe

Project description

Python Client and Documentation

We recommend installing a client version that matches the version of the h2oGPTe software you are connecting to:

pip install h2ogpte
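
For example, if your server is running release 1.5.16.post2 (the wheel version listed under File details below), pin the client to that same version; substitute whatever version your deployment reports:

pip install h2ogpte==1.5.16.post2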

API Keys and Python Client Examples

API keys are needed to programmatically connect to h2oGPTe from the Python client.

There are two kinds of API keys:

  • A global API key allows a client to impersonate your user for all API calls.
  • A collection-specific API key allows a client to chat only with the specified collection.

Global API keys

If a collection is not specified when creating a new API key, that key is considered to be a global API key. Use global API keys to grant full user impersonation and system-wide access to all of your work. Anyone with access to one of your global API keys can create, delete, or interact with any of your past, current, and future collections, documents, chats, and settings. The GUI offers an Impersonate feature under the user settings.

from h2ogpte import H2OGPTE

client = H2OGPTE(
    address='https://h2ogpte.genai.h2o.ai',
    api_key='sk-XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX',
)
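
# Note (illustrative, not part of the client API): the key can also be read
# from an environment variable instead of being hardcoded, e.g.:
#   import os
#   client = H2OGPTE(
#       address='https://h2ogpte.genai.h2o.ai',
#       api_key=os.environ['H2OGPTE_API_KEY'],  # example variable name
#   )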

# Create a new collection
collection_id = client.create_collection(
    name='Contracts',
    description='Paper clip supply contracts',
)

# Create documents
# Note: Done for demonstration purposes only (not usually needed)
with open('dunder_mifflin.txt', 'w') as f:
    f.write('There were 55 paper clips shipped, 22 to Scranton and 33 to Filmer.')

with open('initech.txt', 'w') as f:
    f.write('David Brent did not sign any contract with Initech.')

# Upload documents
# Many file types are supported: text/image/audio documents and archives
with open('dunder_mifflin.txt', 'rb') as f:
    dunder_mifflin = client.upload('Dunder Mifflin.txt', f)

with open('initech.txt', 'rb') as f:
    initech = client.upload('IniTech.txt', f)

# Ingest documents (Creates previews, chunks and embeddings)
client.ingest_uploads(collection_id, [dunder_mifflin, initech])
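
# Optional sanity check (a sketch, reusing list_documents_in_collection from
# the summarization example below): confirm the documents are now in the collection
ingested = client.list_documents_in_collection(collection_id, offset=0, limit=10)
print([d.id for d in ingested])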

# Create a chat session
chat_session_id = client.create_chat_session(collection_id)

# Query the collection
with client.connect(chat_session_id) as session:
    reply = session.query(
        'How many paper clips were shipped to Scranton?',
        timeout=60,
    )
    print(reply.content)

    reply = session.query(
        'Did David Brent co-sign the contract with Initech?',
        timeout=60,
    )
    print(reply.content)

    # If multiple LLMs are available, route to the one with the best
    # price/performance under the given maximum cost
    reply = session.query(
        'Did David Brent co-sign the contract with Initech?',
        llm='auto',
        llm_args=dict(cost_controls=dict(max_cost=1e-2)),
        timeout=60,
    )
    print(reply.content)

    # Classification
    reply = session.query(
        'Did David Brent co-sign the contract with Initech?',
        llm_args=dict(
            guided_choice=['yes', 'no', 'unclear'],
        ),
        timeout=60,
    )
    print(reply.content)
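
    # A sketch of consuming the constrained reply: with guided_choice the
    # content is expected to be one of the listed labels (assumes the model
    # adhered to the constraint)
    is_cosigned = reply.content.strip().lower() == 'yes'
    print(is_cosigned)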

    # Create custom JSON
    reply = session.query(
        'How many paper clips were shipped to Scranton?',
        llm_args=dict(
            response_format='json_object',
            guided_json={
                '$schema': 'http://json-schema.org/draft-07/schema#',
                'type': 'object',
                'properties': {'count': {'type': 'integer'}},
                'required': [
                    'count',
                ],
            },
        ),
        timeout=60,
    )
    print(reply.content)
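
    # A sketch of parsing the structured reply with the standard library,
    # assuming the model followed the JSON schema above
    import json
    print(json.loads(reply.content)['count'])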

    # Force multimodal vision mode (requires vision-capable LLMs)
    reply = session.query(
        'How many paper clips were shipped to Scranton?',
        llm_args=dict(
            enable_vision='on',
        ),
        timeout=60,
    )
    print(reply.content)

# Summarize each document
documents = client.list_documents_in_collection(collection_id, offset=0, limit=99)
for doc in documents:
    summary = client.process_document(
        document_id=doc.id,
        pre_prompt_summary='Pay attention to the following text in order to summarize.',
        prompt_summary='Write a concise summary from the text above.',
        timeout=60,
    )
    print(summary.content)

# Chat with an LLM without a collection
chat_session_id = client.create_chat_session()

with client.connect(chat_session_id) as session:
    reply = session.query(
        'Why is drinking water good for you?',
        timeout=60,
    )
    print(reply.content)

Collection-specific API keys

Use collection-specific API keys to grant external access that is limited to chatting with the specified collection and making related API calls. Collection-specific API keys do not allow any other API calls, such as creating or deleting collections, or accessing other collections or chats.

from h2ogpte import H2OGPTE

client = H2OGPTE(
    address='https://h2ogpte.genai.h2o.ai',
    api_key='sk-XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX',
)

# Automatically connects to the collection from the
# collection-specific API key
chat_session_id = client.create_chat_session_on_default_collection()

# Query the collection
with client.connect(chat_session_id) as session:
    reply = session.query(
        'How many paper clips were shipped to Scranton?',
        timeout=60,
    )
    print(reply.content)

    reply = session.query(
        'Did David Brent co-sign the contract with Initech?',
        timeout=60,
    )
    print(reply.content)

# Summarize each document
default_collection = client.get_default_collection()
documents = client.list_documents_in_collection(default_collection.id, offset=0, limit=99)
for doc in documents:
    summary = client.summarize_document(
        document_id=doc.id,
        timeout=60,
    )
    print(summary.content)

Project details



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files available for this release. See tutorial on generating distribution archives.

Built Distribution

h2ogpte-1.5.16.post2-py3-none-any.whl (67.0 kB)

Uploaded: Python 3

File details

Details for the file h2ogpte-1.5.16.post2-py3-none-any.whl.

File metadata

File hashes

Hashes for h2ogpte-1.5.16.post2-py3-none-any.whl

Algorithm    Hash digest
SHA256       0a0340a40d40cee86f07d1d36fe42d68ede031f2e07da69eee6c16c8fa96cc6d
MD5          0ac106f0d7b24a84f4016b8e1e73e885
BLAKE2b-256  b12916c0c32967c1a831daae08cc57e51a757773c3f8d54d86b9cf1222183522

See the PyPI documentation for more details on using file hashes.
