
GLLM Inference

Description

A library containing components related to model inferences in Gen AI applications.

Installation

1. Installation from Artifact Registry

Choose one of the following methods to install the package:

Using pip

pip install gllm-inference-binary

Using Poetry

poetry add gllm-inference-binary

2. Development Installation (Git)

For development purposes, you can install directly from the Git repository:

poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference"

Available extras:

  • anthropic: dependencies for Anthropic models
  • google-genai: dependencies for Google Generative AI models
  • google-vertexai: dependencies for Google Vertex AI models
  • huggingface: dependencies for HuggingFace models
  • openai: dependencies for OpenAI models
  • twelvelabs: dependencies for TwelveLabs models
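Assuming the package exposes these extras through the standard Python extras mechanism (the extra names come from the list above; the exact invocations below are illustrative), an install with extras might look like:

```shell
# pip: quote the requirement so the brackets are not interpreted by the shell
pip install "gllm-inference-binary[openai]"

# Poetry: pass extras via the --extras (-E) flag of "poetry add"
poetry add gllm-inference-binary --extras openai
```

Multiple extras can be combined, e.g. "gllm-inference-binary[anthropic,openai]".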

Managing Dependencies

  1. Go to the root folder of the gllm-inference module, e.g. cd libs/gllm-inference.
  2. Run poetry shell to spawn a shell inside the virtual environment.
  3. Run poetry lock to create a lock file if you haven't already.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run poetry update whenever you change a dependency version in pyproject.toml.
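The steps above, run from the repository root, can be sketched as a single command sequence (the path comes from step 1; poetry update is only needed after editing pyproject.toml):

```shell
cd libs/gllm-inference   # step 1: go to the module root
poetry shell             # step 2: enter the virtual environment
poetry lock              # step 3: create the lock file (first time only)
poetry install           # step 4: install the requirements
# poetry update          # step 5: run only after changing pyproject.toml
```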

Contributing

Please refer to this Python Style Guide for information about the code style, documentation standards, and SCA tools you need to use when contributing to this project.

  1. Activate the pre-commit hooks using pre-commit install.
  2. Run poetry shell to spawn a shell inside the virtual environment.
  3. Run poetry lock to create a lock file if you haven't already.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run which python to get the interpreter path to reference in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P).
  6. Run the unit tests to verify that everything works:
poetry run pytest -s tests/unit_tests/
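A condensed sketch of the contributor-specific steps above (assuming the dependency setup from "Managing Dependencies" is already done and you are inside libs/gllm-inference):

```shell
pre-commit install                      # step 1: enable the pre-commit hooks
which python                            # step 5: interpreter path to set in VS Code
poetry run pytest -s tests/unit_tests/  # step 6: run the unit tests
```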


Download files


Source Distributions

No source distribution files are available for this release.

Built Distributions


  • gllm_inference_binary-0.5.10b6-cp313-cp313-manylinux_2_31_x86_64.whl (1.5 MB): CPython 3.13, manylinux (glibc 2.31+), x86-64
  • gllm_inference_binary-0.5.10b6-cp312-cp312-manylinux_2_31_x86_64.whl (1.5 MB): CPython 3.12, manylinux (glibc 2.31+), x86-64
  • gllm_inference_binary-0.5.10b6-cp311-cp311-manylinux_2_31_x86_64.whl (1.4 MB): CPython 3.11, manylinux (glibc 2.31+), x86-64

File details

Hashes for gllm_inference_binary-0.5.10b6-cp313-cp313-manylinux_2_31_x86_64.whl:

  • SHA256: a2cbb629b85202291ef69ad5fbe7983af1a536543968db5b3007caabd5390e1b
  • MD5: cd4cbb02bacc8987ea544c2d94ca7ffe
  • BLAKE2b-256: 3d4e55f3ed19c0a0103187048617358926bf1e4d18083312ce7e14772cb04ec2
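A downloaded wheel can be checked against its published SHA256 digest before installing. A minimal Python sketch (sha256_of_file is a hypothetical helper name; the digest in the comment is the cp313 value from the listing above):

```python
import hashlib


def sha256_of_file(path: str, chunk_size: int = 8192) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()


# Compare the result against the published digest, e.g. for the cp313 wheel:
# expected = "a2cbb629b85202291ef69ad5fbe7983af1a536543968db5b3007caabd5390e1b"
# assert sha256_of_file(
#     "gllm_inference_binary-0.5.10b6-cp313-cp313-manylinux_2_31_x86_64.whl"
# ) == expected
```

Streaming in chunks keeps memory use constant regardless of wheel size.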

File details

Hashes for gllm_inference_binary-0.5.10b6-cp312-cp312-manylinux_2_31_x86_64.whl:

  • SHA256: b11b5a107545bf8437543184fa10888eb7aef58c937b4ec9f41e8ea467ad4bc4
  • MD5: acc448f9cf94685cd66a0d5bd1ede243
  • BLAKE2b-256: e184f7e1fa21d963aca24f9f2bdd15198e5e781481a05beeef467f51088f793f

File details

Hashes for gllm_inference_binary-0.5.10b6-cp311-cp311-manylinux_2_31_x86_64.whl:

  • SHA256: 7968dc1b0d35096f28c85031d411527a936f28b50c9eb532ecaa41dc52e791d6
  • MD5: 2b2015712bd91aae331b5fd6d54bce45
  • BLAKE2b-256: d30ad97582e13f6c897e2fe0d0c22c558867b9e6ea64c008f96679577ba0cd45
