Project description

open-chat-api-client

A client library for accessing Open Chat API

Usage

First, create a client:

from open_chat_api_client import Client

client = Client(base_url="https://api.example.com")

If the endpoints you're going to hit require authentication, use AuthenticatedClient instead:

from open_chat_api_client import AuthenticatedClient

client = AuthenticatedClient(base_url="https://api.example.com", token="SuperSecretToken")
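
If your API uses a different authentication scheme, recent versions of the code generator typically let you override the token prefix and header name on AuthenticatedClient. A minimal sketch, assuming the prefix and auth_header_name parameters exist on this client:

from open_chat_api_client import AuthenticatedClient

# Send the token as "X-API-Key: SuperSecretToken" instead of "Authorization: Bearer SuperSecretToken"
client = AuthenticatedClient(
    base_url="https://api.example.com",
    token="SuperSecretToken",
    prefix="",                     # assumed parameter: drop the "Bearer" prefix
    auth_header_name="X-API-Key",  # assumed parameter: send the token in a custom header
)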

Now call your endpoint and use your models:

from open_chat_api_client.models import MyDataModel
from open_chat_api_client.api.my_tag import get_my_data_model
from open_chat_api_client.types import Response

with client as client:
    my_data: MyDataModel = get_my_data_model.sync(client=client)
    # or if you need more info (e.g. status_code)
    response: Response[MyDataModel] = get_my_data_model.sync_detailed(client=client)
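
The Response wrapper returned by sync_detailed carries more than the parsed model. A minimal sketch of inspecting it, assuming the generated types.Response exposes the usual status_code, content, headers, and parsed attributes:

with client as client:
    response: Response[MyDataModel] = get_my_data_model.sync_detailed(client=client)
    if response.status_code == 200 and response.parsed is not None:
        my_data: MyDataModel = response.parsed  # the deserialized model
    else:
        # Fall back to the raw details for debugging
        print(response.status_code, response.content)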

Or do the same thing with an async version:

from open_chat_api_client.models import MyDataModel
from open_chat_api_client.api.my_tag import get_my_data_model
from open_chat_api_client.types import Response

async with client as client:
    my_data: MyDataModel = await get_my_data_model.asyncio(client=client)
    response: Response[MyDataModel] = await get_my_data_model.asyncio_detailed(client=client)
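
Put together, a complete runnable async example might look like this (MyDataModel, my_tag, and get_my_data_model are the placeholder names from above; substitute your real endpoint module):

import asyncio

from open_chat_api_client import AuthenticatedClient
from open_chat_api_client.models import MyDataModel
from open_chat_api_client.api.my_tag import get_my_data_model

async def main() -> None:
    client = AuthenticatedClient(base_url="https://api.example.com", token="SuperSecretToken")
    async with client as client:
        my_data: MyDataModel = await get_my_data_model.asyncio(client=client)
        print(my_data)

asyncio.run(main())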

By default, when calling an HTTPS API the client will attempt to verify the server's SSL certificate. Certificate verification is highly recommended most of the time, but sometimes you may need to authenticate to a server (especially an internal one) using a custom certificate bundle.

client = AuthenticatedClient(
    base_url="https://internal_api.example.com", 
    token="SuperSecretToken",
    verify_ssl="/path/to/certificate_bundle.pem",
)

You can also disable certificate validation altogether, but beware that this is a security risk.

client = AuthenticatedClient(
    base_url="https://internal_api.example.com", 
    token="SuperSecretToken", 
    verify_ssl=False
)
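
If you need more control than a bundle path or a boolean, verify_ssl can usually also be given a preconfigured ssl.SSLContext (an assumption about this client; recent generator versions accept one):

import ssl

from open_chat_api_client import AuthenticatedClient

# Trust the default system CAs plus an internal CA bundle
ssl_context = ssl.create_default_context()
ssl_context.load_verify_locations(cafile="/path/to/certificate_bundle.pem")

client = AuthenticatedClient(
    base_url="https://internal_api.example.com",
    token="SuperSecretToken",
    verify_ssl=ssl_context,
)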

Things to know:

  1. Every path/method combo becomes a Python module with four functions:

    1. sync: Blocking request that returns parsed data (if successful) or None
    2. sync_detailed: Blocking request that always returns a Response, with parsed set if the request was successful.
    3. asyncio: Like sync but async instead of blocking
    4. asyncio_detailed: Like sync_detailed but async instead of blocking
  2. All path/query parameters and request bodies become method arguments (see the sketch after this list).

  3. If your endpoint had any tags on it, the first tag will be used as a module name for the function (my_tag above)

  4. Any endpoint which did not have a tag will be in open_chat_api_client.api.default
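
As a sketch of point 2, a hypothetical endpoint with a path parameter and a query parameter would be called like this (get_chat_messages, chat_id, and limit are made-up names for illustration, not part of this client):

# Hypothetical module for GET /chats/{chat_id}/messages?limit=...
from open_chat_api_client.api.my_tag import get_chat_messages

with client as client:
    # Path params (chat_id) and query params (limit) are plain keyword arguments
    messages = get_chat_messages.sync(client=client, chat_id="abc123", limit=10)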

Advanced customizations

There are more settings on the generated Client class that let you control additional runtime behavior; check out the docstring on that class for more info. You can also customize the underlying httpx.Client or httpx.AsyncClient (depending on your use case):

from open_chat_api_client import Client

def log_request(request):
    print(f"Request event hook: {request.method} {request.url} - Waiting for response")

def log_response(response):
    request = response.request
    print(f"Response event hook: {request.method} {request.url} - Status {response.status_code}")

client = Client(
    base_url="https://api.example.com",
    httpx_args={"event_hooks": {"request": [log_request], "response": [log_response]}},
)

# Or get the underlying httpx client to modify directly with client.get_httpx_client() or client.get_async_httpx_client()
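
For example, you might grab the underlying client to adjust settings that are not exposed on the generated Client. A minimal sketch, continuing with the client created above and assuming get_httpx_client() returns a regular httpx.Client:

import httpx

httpx_client = client.get_httpx_client()
# Tighten the timeout and add a default header for every request
httpx_client.timeout = httpx.Timeout(10.0, connect=5.0)
httpx_client.headers.update({"X-Request-Source": "docs-example"})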

You can even set the httpx client directly, but beware that this will override any existing settings (e.g., base_url):

import httpx
from open_chat_api_client import Client

client = Client(
    base_url="https://api.example.com",
)
# Note that base_url needs to be re-set, as would any shared cookies, headers, etc.
client.set_httpx_client(httpx.Client(base_url="https://api.example.com", proxies="http://localhost:8030"))

Download files

Download the file for your platform.

Source Distribution

open_chat_api_client-1.0.0.tar.gz (20.2 kB)

Uploaded Source

Built Distribution

open_chat_api_client-1.0.0-py3-none-any.whl (56.5 kB)

Uploaded Python 3

File details

Details for the file open_chat_api_client-1.0.0.tar.gz.

File metadata

  • Download URL: open_chat_api_client-1.0.0.tar.gz
  • Upload date:
  • Size: 20.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.6

File hashes

Hashes for open_chat_api_client-1.0.0.tar.gz

  • SHA256: 47805efa9755d8e0b8227912e93d50f44f75d8be93011b04a9f0a6791b135562
  • MD5: a95e8b57ec3ce9ca16a7e19005f39cc5
  • BLAKE2b-256: f1aee95954b3b6617f6e598e994cabd94ebbde36f3ef90b74dcd481ba2150eb7


File details

Details for the file open_chat_api_client-1.0.0-py3-none-any.whl.

File hashes

Hashes for open_chat_api_client-1.0.0-py3-none-any.whl

  • SHA256: 4aac8534ebacaf686433611d7be9a403f82382c4e718dba383b85fa4211b485c
  • MD5: e366891fdad614e20e1baf4051185985
  • BLAKE2b-256: ac39d5f6d3429f306f7f3da3361393db9b08607c3d4e9e089f3473ba3e55de2e

