
GLLM Inference

Description

A library containing components related to model inferences in Gen AI applications.

Installation

Prerequisites

The prebuilt wheels target CPython 3.11–3.13 on x86-64 Linux (glibc 2.31+); see the Built Distributions section below.

1. Installation from Artifact Registry

Choose one of the following methods to install the package:

Using pip

pip install gllm-inference-binary

Using Poetry

poetry add gllm-inference-binary

2. Development Installation (Git)

For development purposes, you can install directly from the Git repository:

poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference"

Available extras:

  • anthropic: installs dependencies for Anthropic models
  • google-genai: installs dependencies for Google Generative AI models
  • google-vertexai: installs dependencies for Google Vertex AI models
  • huggingface: installs dependencies for HuggingFace models
  • openai: installs dependencies for OpenAI models
  • twelvelabs: installs dependencies for TwelveLabs models
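Extras use standard bracket syntax (PEP 508). A minimal sketch, assuming the published gllm-inference-binary package exposes the same extras listed above (this README only lists them for the Git install); the version pin is the release shown on this page:

```shell
# Pin the package together with an extra in a requirements file.
cat > requirements.txt <<'EOF'
gllm-inference-binary[openai]==0.5.10b1
EOF
# Install it with:
#   pip install -r requirements.txt
# or directly (quote the spec so zsh does not glob the brackets):
#   pip install "gllm-inference-binary[openai]"
grep -c '\[openai\]' requirements.txt
```

Multiple extras can be combined in one spec, e.g. `[openai,anthropic]`.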

Managing Dependencies

  1. Go to the root folder of the gllm-inference module, e.g. cd libs/gllm-inference.
  2. Run poetry shell to create and activate a virtual environment.
  3. Run poetry lock to create a lock file if one does not exist yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run poetry update whenever you change a dependency version in pyproject.toml.

Contributing

Please refer to this Python Style Guide for information about the code style, documentation standards, and SCA tools to use when contributing to this project.

  1. Activate the pre-commit hooks using pre-commit install.
  2. Run poetry shell to create and activate a virtual environment.
  3. Run poetry lock to create a lock file if one does not exist yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run which python to get the interpreter path, then set it as the interpreter in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P, "Python: Select Interpreter").
  6. Run the unit tests to verify that everything works:
poetry run pytest -s tests/unit_tests/


Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions


gllm_inference_binary-0.5.10b1-cp313-cp313-manylinux_2_31_x86_64.whl (1.5 MB)
Uploaded: CPython 3.13, manylinux (glibc 2.31+), x86-64

gllm_inference_binary-0.5.10b1-cp312-cp312-manylinux_2_31_x86_64.whl (1.5 MB)
Uploaded: CPython 3.12, manylinux (glibc 2.31+), x86-64

gllm_inference_binary-0.5.10b1-cp311-cp311-manylinux_2_31_x86_64.whl (1.4 MB)
Uploaded: CPython 3.11, manylinux (glibc 2.31+), x86-64
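The wheel names above follow the standard `{distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl` convention (PEP 427). A small sketch that splits one of them into its components — hyphens separate the fields, while underscores inside a field (such as the platform tag) do not:

```shell
# Split a wheel filename into its PEP 427 components.
WHL="gllm_inference_binary-0.5.10b1-cp313-cp313-manylinux_2_31_x86_64.whl"
IFS=- read -r DIST VERSION PYTAG ABITAG PLATFORM <<EOF
${WHL%.whl}
EOF
echo "distribution: $DIST"
echo "python tag:   $PYTAG"
echo "abi tag:      $ABITAG"
echo "platform:     $PLATFORM"
```

Here `cp313` means the wheel is built for CPython 3.13, and `manylinux_2_31_x86_64` means it runs on x86-64 Linux systems with glibc 2.31 or newer.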

File details

Details for the file gllm_inference_binary-0.5.10b1-cp313-cp313-manylinux_2_31_x86_64.whl.

File hashes

  • SHA256: c418e7e7a07a5ccb14a7bad6fc58b821a1877c9b7cd43578a3bee65c2e575684
  • MD5: ced3cdb8df62fe82b577a28e85d674df
  • BLAKE2b-256: ef3fd202b3c316a5efd2c0039fbdb2249bb3d7f3073c709a504d0262e1859e41
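The digests listed here can be enforced at install time: pip's hash-checking mode refuses any downloaded artifact whose digest differs from the pinned one. A sketch using the cp313 wheel's SHA256 from this page (the install itself needs access to the registry, so it is shown as a comment):

```shell
# Hash-checking mode: pin the exact version together with its SHA256.
cat > requirements.txt <<'EOF'
gllm-inference-binary==0.5.10b1 \
    --hash=sha256:c418e7e7a07a5ccb14a7bad6fc58b821a1877c9b7cd43578a3bee65c2e575684
EOF
# Then:
#   pip install --require-hashes -r requirements.txt
# To check an already-downloaded wheel by hand:
#   sha256sum gllm_inference_binary-0.5.10b1-cp313-cp313-manylinux_2_31_x86_64.whl
grep -c 'sha256:' requirements.txt
```

Note that hash-checking mode requires a hash for every requirement, dependencies included, so in practice the whole dependency tree is pinned this way.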

File details

Details for the file gllm_inference_binary-0.5.10b1-cp312-cp312-manylinux_2_31_x86_64.whl.

File hashes

  • SHA256: e312b5397eb2dccc3ac094032b65d308a0328fcc943da19e460cad4b977187f4
  • MD5: d68b6f0cdf024039d1a803590c6b8143
  • BLAKE2b-256: 6986a05492419b9e888069cb84d94008fac1d2c0574ad6ed1b1d36f7f28431b4

File details

Details for the file gllm_inference_binary-0.5.10b1-cp311-cp311-manylinux_2_31_x86_64.whl.

File hashes

  • SHA256: a61a74d1a6cdaccc9b21a8391f608063f4d3bf31b749a43ef0b791333bbe3191
  • MD5: 75165974e02b228cbe24cb828ca538d7
  • BLAKE2b-256: 1efd55159def694fc24c27e31626268bace1fddfe2943177b835885454d11fae
