
Project description

GLLM Inference

Description

A library containing components related to model inferences in Gen AI applications.

Installation

Prerequisites

1. Installation from Artifact Registry

Choose one of the following methods to install the package:

Using pip

pip install gllm-inference-binary

Using Poetry

poetry add gllm-inference-binary

2. Development Installation (Git)

For development purposes, you can install directly from the Git repository:

poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference"

Available extras:

  • anthropic: Install Anthropic models dependencies
  • google-genai: Install Google Generative AI models dependencies
  • google-vertexai: Install Google Vertex AI models dependencies
  • huggingface: Install HuggingFace models dependencies
  • openai: Install OpenAI models dependencies
  • twelvelabs: Install TwelveLabs models dependencies
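As a sketch of how an extra is requested (assuming the published package exposes these as standard extras; check the project's pyproject.toml to confirm), the bracket syntax looks like:

```shell
# Quoting prevents the shell from treating the brackets as a glob pattern
pip install "gllm-inference-binary[openai]"

# Poetry equivalent
poetry add "gllm-inference-binary[openai]"
```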

Managing Dependencies

  1. Go to the root folder of the gllm-inference module, e.g. cd libs/gllm-inference.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't done it yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run poetry update if you update any dependency version in pyproject.toml.
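The steps above, collected into one shell session (a sketch; assumes a Poetry version where the shell command is available):

```shell
# From the repository root: enter the module, set up the env, install deps
cd libs/gllm-inference
poetry shell      # create and activate a virtual environment
poetry lock       # generate poetry.lock if it does not exist yet
poetry install    # install the gllm-inference requirements

# After changing a dependency version in pyproject.toml:
poetry update
```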

Contributing

Please refer to this Python Style Guide for information about the code style, documentation standards, and SCA tools you need to use when contributing to this project.

  1. Activate pre-commit hooks using pre-commit install.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't done it yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run which python to get the interpreter path to reference in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P).
  6. Try running the unit tests to check that everything works:
poetry run pytest -s tests/unit_tests/


Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions


gllm_inference_binary-0.5.26b5-cp313-cp313-manylinux_2_31_x86_64.whl (1.7 MB)
Uploaded: CPython 3.13, manylinux: glibc 2.31+, x86-64

gllm_inference_binary-0.5.26b5-cp312-cp312-manylinux_2_31_x86_64.whl (1.7 MB)
Uploaded: CPython 3.12, manylinux: glibc 2.31+, x86-64

gllm_inference_binary-0.5.26b5-cp311-cp311-manylinux_2_31_x86_64.whl (1.6 MB)
Uploaded: CPython 3.11, manylinux: glibc 2.31+, x86-64

File details

Details for the file gllm_inference_binary-0.5.26b5-cp313-cp313-manylinux_2_31_x86_64.whl.

File hashes

Hashes for gllm_inference_binary-0.5.26b5-cp313-cp313-manylinux_2_31_x86_64.whl:

Algorithm    Hash digest
SHA256       6f6dd362bd132e5e61732edfd1e5ec7ace42c1be03b2eba0466d21c367553719
MD5          bfd0ec94c379805f5d98bed7fd3beebe
BLAKE2b-256  456e22d6474eedb0fbe4883c6164be1237d211e55972d298cab22b638f6d8a05

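To verify a downloaded wheel against the digests listed above, a short Python sketch using the standard library (the wheel path in the usage comment is a placeholder for wherever the file was saved):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Return the hex SHA256 digest of a file, read in chunks to bound memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# SHA256 digest listed above for the cp313 wheel
EXPECTED = "6f6dd362bd132e5e61732edfd1e5ec7ace42c1be03b2eba0466d21c367553719"

# Usage (path is a placeholder for the downloaded wheel):
# assert sha256_of("gllm_inference_binary-0.5.26b5-cp313-cp313-manylinux_2_31_x86_64.whl") == EXPECTED
```

pip can also enforce this automatically by pinning the package in a requirements file with `--hash` entries and installing with hash-checking mode.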

File details

Details for the file gllm_inference_binary-0.5.26b5-cp312-cp312-manylinux_2_31_x86_64.whl.

File hashes

Hashes for gllm_inference_binary-0.5.26b5-cp312-cp312-manylinux_2_31_x86_64.whl:

Algorithm    Hash digest
SHA256       0be7d085c78eede7c6f0b402fed7794df3379198f152ca4300b85d05b5672969
MD5          fd34902677f2f5cb0fb97b531e698859
BLAKE2b-256  10cbf10e3ba6e480cc343b07a5382164e0843cce72667a77ca3e599ed1968090


File details

Details for the file gllm_inference_binary-0.5.26b5-cp311-cp311-manylinux_2_31_x86_64.whl.

File hashes

Hashes for gllm_inference_binary-0.5.26b5-cp311-cp311-manylinux_2_31_x86_64.whl:

Algorithm    Hash digest
SHA256       38c1449e83220d2445b4cb510b591b5259f8dcf0fc224f1d134bab9d8f331f12
MD5          bebe9e50d9a8a9c101b660948a74f464
BLAKE2b-256  c5fc0040ec21d0eaeb0d691fb7bf26af59b7c68bf630815d1b6011f61b876ff2

