
Project description

GLLM Inference

Description

A library containing components related to model inferences in Gen AI applications.

Installation

Prerequisites

1. Installation from Artifact Registry

Choose one of the following methods to install the package:

Using pip

pip install gllm-inference-binary

Using Poetry

poetry add gllm-inference-binary

2. Development Installation (Git)

For development purposes, you can install directly from the Git repository:

poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference"

Available extras:

  • anthropic: Install dependencies for Anthropic models
  • google-genai: Install dependencies for Google Generative AI models
  • google-vertexai: Install dependencies for Google Vertex AI models
  • huggingface: Install dependencies for HuggingFace models
  • openai: Install dependencies for OpenAI models
  • twelvelabs: Install dependencies for TwelveLabs models
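
To pull in the optional dependencies for a given provider, the extras can be requested at install time. A minimal sketch, assuming the package supports the standard extras syntax (the chosen extras here are just examples from the list above):

```shell
# Install with the OpenAI extra via pip (quotes keep the shell from expanding the brackets)
pip install "gllm-inference-binary[openai]"

# Or with Poetry; multiple extras are comma-separated inside the brackets
poetry add "gllm-inference-binary[openai,anthropic]"
```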

Managing Dependencies

  1. Go to the root folder of the gllm-inference module, e.g. cd libs/gllm-inference.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't already.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run poetry update if you update any dependency version in pyproject.toml.

Contributing

Please refer to this Python Style Guide for information about the code style, documentation standards, and SCA tools to use when contributing to this project.

  1. Activate pre-commit hooks using pre-commit install.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't already.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run which python to get the interpreter path to reference in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P).
  6. Run the unit tests to verify everything works:
poetry run pytest -s tests/unit_tests/


Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions

gllm_inference_binary-0.5.10b7-py3-none-any.whl (102.6 kB, Python 3)
gllm_inference_binary-0.5.10b7-cp313-cp313-manylinux_2_31_x86_64.whl (1.5 MB, CPython 3.13, manylinux: glibc 2.31+, x86-64)
gllm_inference_binary-0.5.10b7-cp312-cp312-manylinux_2_31_x86_64.whl (1.5 MB, CPython 3.12, manylinux: glibc 2.31+, x86-64)
gllm_inference_binary-0.5.10b7-cp311-cp311-manylinux_2_31_x86_64.whl (1.4 MB, CPython 3.11, manylinux: glibc 2.31+, x86-64)

File details

gllm_inference_binary-0.5.10b7-py3-none-any.whl
SHA256: 0a18a9ca1a505a52dfc34b9f9ffceb861f76b5f84b532a592da326388a08f6c9
MD5: 95adc9a3db685211eae0b63d763e0472
BLAKE2b-256: f88c1f179346c4b24a02c4f0b591a65a6c82c1874ffe63a82cb4db355bd745cf

Provenance

The following attestation bundles were made for gllm_inference_binary-0.5.10b7-py3-none-any.whl:

Publisher: build-binary.yml on GDP-ADMIN/gl-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

gllm_inference_binary-0.5.10b7-cp313-cp313-manylinux_2_31_x86_64.whl
SHA256: 3692dcf92dcb186615c3d8b495a00823fb5b73d977ea29b1d20965fcfa7acb3c
MD5: f788a393eee6b261d3e6a6a8bdffc522
BLAKE2b-256: 2570d681574b686b6447cbb5468e61392be3f654521afcb30efb04430065a2ff

gllm_inference_binary-0.5.10b7-cp312-cp312-manylinux_2_31_x86_64.whl
SHA256: e00ad6c390a4ad8f030168c57a54f8a6aaafe3d17ef01e340f32c2886327c92f
MD5: a64fcafd0754f7db3e8562a6f9e90faf
BLAKE2b-256: f61b8040f4872474ad00376098d6f3b9282af2252dbb11447b2727ea330e4c3f

gllm_inference_binary-0.5.10b7-cp311-cp311-manylinux_2_31_x86_64.whl
SHA256: 47930c7e35cf85d3526067133d017d1e47eff5730d6dd369b9ea6557e51720ac
MD5: c76c66d771e59e012987c9cb86b01144
BLAKE2b-256: f263213802ab304143181d41b4bce5abd3d134e757259e43c2a4e35216b60aa9
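
The published hashes can be used to verify a downloaded wheel before installing it. A minimal sketch using only the standard library; the wheel filename and expected digest below are copied from the py3-none-any wheel listed above:

```python
import hashlib
from pathlib import Path


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 hex digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Expected digest from the release page (py3-none-any wheel).
EXPECTED = "0a18a9ca1a505a52dfc34b9f9ffceb861f76b5f84b532a592da326388a08f6c9"

wheel = "gllm_inference_binary-0.5.10b7-py3-none-any.whl"
if Path(wheel).exists():
    assert sha256_of(wheel) == EXPECTED, "hash mismatch: do not install this wheel"
```

Alternatively, pip can enforce this automatically via hash-checking mode, by pinning the release in a requirements file with a `--hash=sha256:...` option on the requirement line.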
