Project description

GLLM Inference

Description

A library containing components related to model inferences in Gen AI applications.

Installation

Prerequisites

1. Installation from Artifact Registry

Choose one of the following methods to install the package:

Using pip

pip install gllm-inference-binary

Using Poetry

poetry add gllm-inference-binary
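Either command pulls the package from the configured Artifact Registry. A quick way to confirm the installation afterwards is to query the installed distribution metadata; a minimal sketch (`installed_version` is a hypothetical helper, and the distribution name is taken from the commands above):

```python
from importlib import metadata
from typing import Optional

def installed_version(dist_name: str) -> Optional[str]:
    """Return the installed version of a distribution, or None if it is absent."""
    try:
        return metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        return None

# Prints the installed version string, or None if the package is not installed.
print(installed_version("gllm-inference-binary"))
```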

2. Development Installation (Git)

For development purposes, you can install directly from the Git repository:

poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference"

Available extras:

  • anthropic: Install Anthropic models dependencies
  • google-genai: Install Google Generative AI models dependencies
  • google-vertexai: Install Google Vertex AI models dependencies
  • huggingface: Install HuggingFace models dependencies
  • openai: Install OpenAI models dependencies
  • twelvelabs: Install TwelveLabs models dependencies
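Assuming the binary package exposes these extras in the usual way, a provider-specific install would look like the following (hypothetical example using the openai extra):

```shell
# Install the package together with the OpenAI provider dependencies
pip install "gllm-inference-binary[openai]"

# Or, with Poetry
poetry add "gllm-inference-binary[openai]"
```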

Managing Dependencies

  1. Go to the root folder of the gllm-inference module, e.g. cd libs/gllm-inference.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't done so yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run poetry update whenever you update a dependency version in pyproject.toml.
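The numbered steps above can be run as a single shell session (a sketch, assuming Poetry is installed; on Poetry 2.x, poetry shell requires the shell plugin):

```shell
cd libs/gllm-inference   # 1. go to the module's root folder
poetry shell             # 2. create / enter a virtual environment
poetry lock              # 3. create the lock file (first time only)
poetry install           # 4. install the requirements
# 5. after changing a dependency version in pyproject.toml:
poetry update
```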

Contributing

Please refer to this Python Style Guide for information about the code style, documentation standards, and SCA tools to use when contributing to this project.

  1. Activate the pre-commit hooks using pre-commit install.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't done so yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run which python to get the interpreter path to set in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P).
  6. Run the unit tests to check that everything works:
poetry run pytest -s tests/unit_tests/
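The contributing setup above, condensed into one session (same assumptions as the installation steps; the test path is the one given above):

```shell
pre-commit install                      # 1. activate the pre-commit hooks
poetry shell                            # 2. create / enter a virtual environment
poetry lock && poetry install           # 3-4. lock file and requirements
which python                            # 5. interpreter path for VS Code
poetry run pytest -s tests/unit_tests/  # 6. run the unit tests
```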

Project details



Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions


  • gllm_inference_binary-0.5.10b5-py3-none-any.whl (102.6 kB): Python 3
  • gllm_inference_binary-0.5.10b5-cp313-cp313-manylinux_2_31_x86_64.whl (1.5 MB): CPython 3.13, manylinux (glibc 2.31+), x86-64
  • gllm_inference_binary-0.5.10b5-cp312-cp312-manylinux_2_31_x86_64.whl (1.5 MB): CPython 3.12, manylinux (glibc 2.31+), x86-64
  • gllm_inference_binary-0.5.10b5-cp311-cp311-manylinux_2_31_x86_64.whl (1.4 MB): CPython 3.11, manylinux (glibc 2.31+), x86-64

File details

Details for the file gllm_inference_binary-0.5.10b5-py3-none-any.whl.

File hashes

  • SHA256: fc51a68ced7641ee84333dc7692dccadbe08f678d9cc81e7cc054d2f432eaf7d
  • MD5: 5cd12958d38c4f62b1b3a29c789e6c5a
  • BLAKE2b-256: 5ae9ea15a1751f9a8ff8f01541783b0221033873e2f7f1e4c8ea4094a2f316f2

Provenance

The following attestation bundles were made for gllm_inference_binary-0.5.10b5-py3-none-any.whl:

Publisher: build-binary.yml on GDP-ADMIN/gl-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file gllm_inference_binary-0.5.10b5-cp313-cp313-manylinux_2_31_x86_64.whl.

File hashes

  • SHA256: e5d66849a20dd4a55f92bd1a1da7ef34160918cedc11e79b9914bfe02de13315
  • MD5: 65015358cb55b13d5ce319e5d98f4394
  • BLAKE2b-256: 3463fd30aeb9fff4100a15984434d94d328898eedebfd14a25e51c3df6a9bab0

File details

Details for the file gllm_inference_binary-0.5.10b5-cp312-cp312-manylinux_2_31_x86_64.whl.

File hashes

  • SHA256: d1c6b8ebda2369182cfdcd183a51d39635fb5aab9396c1d9f670f7c6e6c12996
  • MD5: 1a5cee995629c6b73bdd02e8f9288827
  • BLAKE2b-256: abf8c55455dc68aea63ad044e39e57d5061cd4aa43e8eb6fe127dfe789060925

File details

Details for the file gllm_inference_binary-0.5.10b5-cp311-cp311-manylinux_2_31_x86_64.whl.

File hashes

  • SHA256: 8aee0604e51cdfa3bf77de40d516e4e7387b105e629f3f66b6721256e492c253
  • MD5: c9caf45d30dd862ce6140b4cf257340a
  • BLAKE2b-256: 212faa65043425663584bdc9d0bba7d03387ebed83687768782a3c8894bf5277
