
Project description

longship-api-client


A client library for accessing Longship API

Usage

First, create a client:

from longship_api_client import Client

client = Client(base_url="https://api.example.com")

If the endpoints you're going to hit require authentication, use AuthenticatedClient instead:

from longship_api_client import AuthenticatedClient

client = AuthenticatedClient(base_url="https://api.example.com", token="SuperSecretToken")

Now call your endpoint and use your models:

from longship_api_client.models import MyDataModel
from longship_api_client.api.my_tag import get_my_data_model
from longship_api_client.types import Response

my_data: MyDataModel = get_my_data_model.sync(client=client)
# or if you need more info (e.g. status_code)
response: Response[MyDataModel] = get_my_data_model.sync_detailed(client=client)

Or do the same thing with an async version:

from longship_api_client.models import MyDataModel
from longship_api_client.api.my_tag import get_my_data_model
from longship_api_client.types import Response

my_data: MyDataModel = await get_my_data_model.asyncio(client=client)
response: Response[MyDataModel] = await get_my_data_model.asyncio_detailed(client=client)
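
Note that the asyncio variants must be awaited inside a coroutine. A minimal sketch of driving the call with asyncio.run (module and function names taken from the placeholder example above):

import asyncio

from longship_api_client import Client
from longship_api_client.api.my_tag import get_my_data_model


async def main():
    # Create the client and await the async variant of the endpoint call
    client = Client(base_url="https://api.example.com")
    my_data = await get_my_data_model.asyncio(client=client)
    print(my_data)


asyncio.run(main())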

By default, when calling an HTTPS API the client will verify the server's SSL certificate. Certificate verification is highly recommended in most cases, but sometimes you may need to authenticate to a server (especially an internal one) that uses a custom certificate bundle:

client = AuthenticatedClient(
    base_url="https://internal_api.example.com", 
    token="SuperSecretToken",
    verify_ssl="/path/to/certificate_bundle.pem",
)

You can also disable certificate validation altogether, but beware that this is a security risk.

client = AuthenticatedClient(
    base_url="https://internal_api.example.com", 
    token="SuperSecretToken", 
    verify_ssl=False
)

There are more settings on the generated Client class that let you control runtime behavior; check out the docstring on that class for more info.
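
The exact options depend on the openapi-python-client version that generated this package, so treat the following as a sketch rather than a definitive list; headers, timeout, and raise_on_unexpected_status are assumed here and should be confirmed against the Client docstring:

import httpx

from longship_api_client import AuthenticatedClient

client = AuthenticatedClient(
    base_url="https://api.example.com",
    token="SuperSecretToken",
    headers={"X-Correlation-Id": "my-trace-id"},  # extra headers sent with every request (assumed setting)
    timeout=httpx.Timeout(30.0),                  # per-request timeout (assumed setting)
    raise_on_unexpected_status=True,              # raise on undocumented status codes instead of returning them (assumed setting)
)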

Things to know:

  1. Every path/method combo becomes a Python module with four functions:

    1. sync: Blocking request that returns parsed data (if successful) or None.
    2. sync_detailed: Blocking request that always returns a Response, optionally with parsed set if the request was successful (see the sketch after this list).
    3. asyncio: Like sync but async instead of blocking.
    4. asyncio_detailed: Like sync_detailed but async instead of blocking.
  2. All path/query params and bodies become method arguments.

  3. If your endpoint had any tags on it, the first tag will be used as the module name for the function (my_tag above).

  4. Any endpoint that did not have a tag will be in longship_api_client.api.default.
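
The detailed variants return a Response object rather than just the parsed model. A rough sketch of what you get back (field names assumed from longship_api_client.types.Response; check that module for the exact definition):

from longship_api_client.api.my_tag import get_my_data_model

response = get_my_data_model.sync_detailed(client=client)
print(response.status_code)  # HTTP status code of the response
print(response.headers)      # response headers
print(response.content)      # raw response body as bytes
print(response.parsed)       # parsed model, or None if the request was not successful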

Building / publishing this Client

This project uses Poetry to manage dependencies and packaging. Here are the basics:

  1. Update the metadata in pyproject.toml (e.g. authors, version)
  2. If you're using a private repository, configure it with Poetry
    1. poetry config repositories.<your-repository-name> <url-to-your-repository>
    2. poetry config http-basic.<your-repository-name> <username> <password>
  3. Publish the client with poetry publish --build -r <your-repository-name> or, if for public PyPI, just poetry publish --build

If you want to install this client into another project without publishing it (e.g. for development) then:

  1. If that project is using Poetry, you can simply do poetry add <path-to-this-client> from that project
  2. If that project is not using Poetry:
    1. Build a wheel with poetry build -f wheel
    2. Install that wheel from the other project with pip install <path-to-wheel>

Updating the client

This client was generated with openapi-python-client. To update it to the latest version of the Longship API, download the Swagger configuration to a JSON file and run the following in the directory one level above this repo:

openapi-python-client update --path longship-api-client/fixtures/longship_24-06-23.json

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

longship-2024.4.17.tar.gz (114.3 kB, Source)

Built Distribution

longship-2024.4.17-py3-none-any.whl (326.8 kB, Python 3)

File details

Details for the file longship-2024.4.17.tar.gz.

File metadata

  • Download URL: longship-2024.4.17.tar.gz
  • Upload date:
  • Size: 114.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.2 Darwin/23.4.0

File hashes

Hashes for longship-2024.4.17.tar.gz
  • SHA256: 46827e34d152139a300ff3c1d1f38668bca2f84a21abeb8764de41b6ba15ba5c
  • MD5: ad4445317328f83ad9abe024c9b2b9fb
  • BLAKE2b-256: 260f1cac8bf88515e9f21306236ad2969033411aa2d57b2028ef7a17f5c26cbf

See more details on using hashes here.
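
If you want to verify a downloaded file against the digests listed above, here is a minimal sketch using Python's hashlib (filename and expected value copied from this listing):

import hashlib

expected_sha256 = "46827e34d152139a300ff3c1d1f38668bca2f84a21abeb8764de41b6ba15ba5c"

# Hash the downloaded sdist and compare against the published digest
with open("longship-2024.4.17.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()

print("OK" if actual == expected_sha256 else "hash mismatch")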

File details

Details for the file longship-2024.4.17-py3-none-any.whl.

File metadata

  • Download URL: longship-2024.4.17-py3-none-any.whl
  • Upload date:
  • Size: 326.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.2 Darwin/23.4.0

File hashes

Hashes for longship-2024.4.17-py3-none-any.whl
  • SHA256: a889ef4eff2bc7536e2e02508f625516ca7a57bd810f295ee27aaed044ab01e9
  • MD5: 5ca7b592c29e4c65d6915fb4cd72f272
  • BLAKE2b-256: a53728499ae3011193de4703e165bc8cf0c3a55a0e92dd92c1038c9de819168c

See more details on using hashes here.
