
longship-api-client


A client library for accessing Longship API

Usage

First, create a client:

from longship_api_client import Client

client = Client(base_url="https://api.example.com")

If the endpoints you're going to hit require authentication, use AuthenticatedClient instead:

from longship_api_client import AuthenticatedClient

client = AuthenticatedClient(base_url="https://api.example.com", token="SuperSecretToken")

Now call your endpoint and use your models:

from longship_api_client.models import MyDataModel
from longship_api_client.api.my_tag import get_my_data_model
from longship_api_client.types import Response

my_data: MyDataModel = get_my_data_model.sync(client=client)
# or if you need more info (e.g. status_code)
response: Response[MyDataModel] = get_my_data_model.sync_detailed(client=client)

Or do the same thing with an async version:

from longship_api_client.models import MyDataModel
from longship_api_client.api.my_tag import get_my_data_model
from longship_api_client.types import Response

my_data: MyDataModel = await get_my_data_model.asyncio(client=client)
response: Response[MyDataModel] = await get_my_data_model.asyncio_detailed(client=client)
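
The asyncio variants must be awaited from inside a coroutine, typically driven by asyncio.run. A minimal self-contained sketch of that pattern, using a hypothetical stub in place of the generated function (the real one lives in longship_api_client.api.my_tag):

```python
import asyncio
from dataclasses import dataclass

# Hypothetical stand-ins for a model and a generated endpoint's asyncio()
# function; in real code these come from longship_api_client.
@dataclass
class MyDataModel:
    name: str

async def fetch_my_data_model() -> MyDataModel:
    await asyncio.sleep(0)  # placeholder for the real HTTP round trip
    return MyDataModel(name="example")

async def main() -> MyDataModel:
    # In real code: await get_my_data_model.asyncio(client=client)
    return await fetch_my_data_model()

result = asyncio.run(main())
print(result.name)  # -> example
```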

By default, the client verifies SSL certificates when calling an HTTPS API. Certificate verification is highly recommended most of the time, but sometimes you may need to talk to a server (especially an internal server) that requires a custom certificate bundle.

client = AuthenticatedClient(
    base_url="https://internal_api.example.com", 
    token="SuperSecretToken",
    verify_ssl="/path/to/certificate_bundle.pem",
)

You can also disable certificate validation altogether, but beware that this is a security risk.

client = AuthenticatedClient(
    base_url="https://internal_api.example.com", 
    token="SuperSecretToken", 
    verify_ssl=False
)

The generated Client class exposes additional settings that control runtime behavior; check the docstring on that class for more info.

Things to know:

  1. Every path/method combo becomes a Python module with four functions:

    1. sync: Blocking request that returns parsed data (if successful) or None
    2. sync_detailed: Blocking request that always returns a Response, with parsed set if the request was successful.
    3. asyncio: Like sync but async instead of blocking
    4. asyncio_detailed: Like sync_detailed but async instead of blocking
  2. All path/query params and bodies become method arguments.

  3. If your endpoint had any tags on it, the first tag will be used as the module name for the function (my_tag above).

  4. Any endpoint that did not have a tag will be in longship_api_client.api.default.
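
To make the sync vs sync_detailed distinction concrete, here is a toy, self-contained sketch of the pattern. The Response container below is hypothetical, only modeled on the shape of longship_api_client.types.Response; the generated functions perform a real HTTP request instead of returning canned data:

```python
from dataclasses import dataclass
from typing import Generic, Optional, TypeVar

T = TypeVar("T")

# Hypothetical sketch of the Response container returned by *_detailed calls.
@dataclass
class Response(Generic[T]):
    status_code: int
    content: bytes
    parsed: Optional[T]

def sync_detailed() -> Response[dict]:
    # A generated function would perform the HTTP request here.
    return Response(status_code=200, content=b'{"id": 1}', parsed={"id": 1})

def sync() -> Optional[dict]:
    # sync() is sync_detailed() with only the parsed body kept;
    # it returns None when parsing was not possible.
    return sync_detailed().parsed

resp = sync_detailed()
print(resp.status_code, sync())  # -> 200 {'id': 1}
```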

Building / publishing this Client

This project uses Poetry to manage dependencies and packaging. Here are the basics:

  1. Update the metadata in pyproject.toml (e.g. authors, version)
  2. If you're using a private repository, configure it with Poetry
    1. poetry config repositories.<your-repository-name> <url-to-your-repository>
    2. poetry config http-basic.<your-repository-name> <username> <password>
  3. Publish the client with poetry publish --build -r <your-repository-name>, or, for public PyPI, just poetry publish --build

If you want to install this client into another project without publishing it (e.g. for development) then:

  1. If that project is using Poetry, you can simply do poetry add <path-to-this-client> from that project
  2. If that project is not using Poetry:
    1. Build a wheel with poetry build -f wheel
    2. Install that wheel from the other project: pip install <path-to-wheel>

Updating the client

This client was generated with openapi-python-client. To update it to the latest version of the Longship API, download the Swagger configuration to a JSON file and run the following in a directory one level above this repo:

openapi-python-client generate --path longship-api-client/fixtures/longship_24-06-23.json
