
Project description

longship-api-client


A client library for accessing Longship API

Usage

First, create a client:

from longship_api_client import Client

client = Client(base_url="https://api.example.com")

If the endpoints you're going to hit require authentication, use AuthenticatedClient instead:

from longship_api_client import AuthenticatedClient

client = AuthenticatedClient(base_url="https://api.example.com", token="SuperSecretToken")

Now call your endpoint and use your models:

from longship_api_client.models import MyDataModel
from longship_api_client.api.my_tag import get_my_data_model
from longship_api_client.types import Response

my_data: MyDataModel = get_my_data_model.sync(client=client)
# or if you need more info (e.g. status_code)
response: Response[MyDataModel] = get_my_data_model.sync_detailed(client=client)
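
The detailed variant is useful when you need to branch on the HTTP result. Here is a minimal sketch, assuming the Response fields (status_code, parsed, content) provided by the openapi-python-client template this library was generated from:

from http import HTTPStatus

from longship_api_client.models import MyDataModel
from longship_api_client.api.my_tag import get_my_data_model
from longship_api_client.types import Response

response: Response[MyDataModel] = get_my_data_model.sync_detailed(client=client)
if response.status_code == HTTPStatus.OK and response.parsed is not None:
    my_data: MyDataModel = response.parsed  # parsed model on success
else:
    print(f"Request failed: {response.status_code} {response.content!r}")  # raw response bytes for debugging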

Or do the same thing with an async version:

from longship_api_client.models import MyDataModel
from longship_api_client.api.my_tag import get_my_data_model
from longship_api_client.types import Response

my_data: MyDataModel = await get_my_data_model.asyncio(client=client)
response: Response[MyDataModel] = await get_my_data_model.asyncio_detailed(client=client)
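
Because these are coroutines, they need a running event loop. A minimal sketch using asyncio.run, reusing the same placeholder my_tag endpoint from above:

import asyncio

from longship_api_client import Client
from longship_api_client.api.my_tag import get_my_data_model


async def main() -> None:
    client = Client(base_url="https://api.example.com")
    # asyncio() is the non-blocking counterpart of sync()
    my_data = await get_my_data_model.asyncio(client=client)
    print(my_data)


asyncio.run(main())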

By default, when you're calling an HTTPS API the client will attempt to verify the server's SSL certificate. Certificate verification is highly recommended most of the time, but sometimes you may need to authenticate to a server (especially an internal server) using a custom certificate bundle:

client = AuthenticatedClient(
    base_url="https://internal_api.example.com", 
    token="SuperSecretToken",
    verify_ssl="/path/to/certificate_bundle.pem",
)

You can also disable certificate validation altogether, but beware that this is a security risk:

client = AuthenticatedClient(
    base_url="https://internal_api.example.com", 
    token="SuperSecretToken", 
    verify_ssl=False
)

There are more settings on the generated Client class that let you control runtime behavior; check out the docstring on that class for more info.
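
For example, you can usually set request timeouts and default headers when constructing the client. The parameter names below (timeout, headers) come from the openapi-python-client template and may differ between generator versions, so treat this as a sketch and check the docstring:

from longship_api_client import AuthenticatedClient

client = AuthenticatedClient(
    base_url="https://api.example.com",
    token="SuperSecretToken",
    timeout=10.0,  # per-request timeout in seconds (assumed parameter name)
    headers={"X-Request-Source": "my-app"},  # extra default headers (assumed parameter name)
)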

Things to know:

  1. Every path/method combo becomes a Python module with four functions:

    1. sync: Blocking request that returns parsed data (if successful) or None
    2. sync_detailed: Blocking request that always returns a Response, optionally with parsed set if the request was successful.
    3. asyncio: Like sync but async instead of blocking
    4. asyncio_detailed: Like sync_detailed but async instead of blocking
  2. All path/query params and bodies become method arguments (see the sketch after this list).

  3. If your endpoint had any tags on it, the first tag will be used as a module name for the function (my_tag above)

  4. Any endpoint which did not have a tag will be in longship_api_client.api.default
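
As a concrete illustration of points 2-4, a call to a tagged endpoint with a path and a query parameter might look like the following. The module, function, and parameter names (charge_points, get_chargepoint_by_id, chargepoint_id, include_connectors) are invented for this sketch and are not real Longship endpoints:

from longship_api_client import AuthenticatedClient
# Hypothetical module: the endpoint's first tag ("charge_points") becomes the package,
# and the operation becomes a module exposing sync/sync_detailed/asyncio/asyncio_detailed.
from longship_api_client.api.charge_points import get_chargepoint_by_id

client = AuthenticatedClient(base_url="https://api.example.com", token="SuperSecretToken")

chargepoint = get_chargepoint_by_id.sync(
    client=client,
    chargepoint_id="CP-0001",    # path parameter (hypothetical)
    include_connectors=True,     # query parameter (hypothetical)
)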

Building / publishing this Client

This project uses Poetry to manage dependencies and packaging. Here are the basics:

  1. Update the metadata in pyproject.toml (e.g. authors, version)
  2. If you're using a private repository, configure it with Poetry
    1. poetry config repositories.<your-repository-name> <url-to-your-repository>
    2. poetry config http-basic.<your-repository-name> <username> <password>
  3. Publish the client with poetry publish --build -r <your-repository-name> or, if you're publishing to the public PyPI, just poetry publish --build

If you want to install this client into another project without publishing it (e.g. for development) then:

  1. If that project is using Poetry, you can simply do poetry add <path-to-this-client> from that project
  2. If that project is not using Poetry:
    1. Build a wheel with poetry build -f wheel
    2. Install that wheel from the other project: pip install <path-to-wheel>

Updating the client

This client was generated with openapi-python-client. To update it to the latest version of the Longship API, download the Swagger configuration to a JSON file and run the following in a directory one level above this repo:

openapi-python-client update --path longship-api-client/fixtures/longship_24-06-23.json

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

longship-2024.4.18.tar.gz (114.3 kB)


Built Distribution

longship-2024.4.18-py3-none-any.whl (326.8 kB)


File details

Details for the file longship-2024.4.18.tar.gz.

File metadata

  • Download URL: longship-2024.4.18.tar.gz
  • Upload date:
  • Size: 114.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.2 Darwin/23.4.0

File hashes

Hashes for longship-2024.4.18.tar.gz

  • SHA256: 2c382337f014fdb7ef5b5f1a2d8a4e1cc2cef84ea6c3e3738758cf2667d7e7b9
  • MD5: 43b031e91269c83d944d4f77a3b10c11
  • BLAKE2b-256: b28b082001606f2d2fe3deba21daec4f954343c9b7996ec7dd74014b36f3641b


File details

Details for the file longship-2024.4.18-py3-none-any.whl.

File metadata

  • Download URL: longship-2024.4.18-py3-none-any.whl
  • Upload date:
  • Size: 326.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.2 Darwin/23.4.0

File hashes

Hashes for longship-2024.4.18-py3-none-any.whl

  • SHA256: c0239974efb00f8ecfe0d615927caaf93751f595e1248fcb1b214dcdaa1b9557
  • MD5: 72503977c1e5d821d7ddb1e85baece0c
  • BLAKE2b-256: dc06f86f0fe73c4d310dc117bb922759653616ca5e64b7f08dd7f413c1ba8146

