
Project description

GLLM Inference

Description

A library containing components related to model inferences in Gen AI applications.

Installation

Prerequisites

1. Installation from Artifact Registry

Choose one of the following methods to install the package:

Using pip

pip install gllm-inference-binary

Using Poetry

poetry add gllm-inference-binary

2. Development Installation (Git)

For development purposes, you can install directly from the Git repository:

poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference"

Available extras:

  • anthropic: installs dependencies for Anthropic models
  • google-genai: installs dependencies for Google Generative AI models
  • google-vertexai: installs dependencies for Google Vertex AI models
  • huggingface: installs dependencies for HuggingFace models
  • openai: installs dependencies for OpenAI models
  • twelvelabs: installs dependencies for TwelveLabs models
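Extras are requested with the usual bracket syntax at install time. A sketch, assuming the binary package publishes the same extras as the source library (check the package metadata to confirm):

```shell
# Install with the OpenAI extra (extra name taken from the list above;
# that the binary distribution declares identical extras is an assumption).
pip install "gllm-inference-binary[openai]"

# Poetry equivalent:
poetry add "gllm-inference-binary[openai]"
```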

Managing Dependencies

  1. Go to the root folder of the gllm-inference module, e.g. cd libs/gllm-inference.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't done so yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run poetry update if you update any dependency version in pyproject.toml.
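Taken together, the steps above can be sketched as a command sequence:

```shell
cd libs/gllm-inference   # 1. go to the module root
poetry shell             # 2. spawn a shell inside the virtual environment
                         #    (interactive; in scripts, prefix commands with `poetry run` instead)
poetry lock              # 3. create the lock file (first time only)
poetry install           # 4. install the gllm-inference requirements
poetry update            # 5. only after changing versions in pyproject.toml
```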

Contributing

Please refer to this Python Style Guide for information about the code style, documentation standards, and SCA tools to use when contributing to this project.

  1. Activate the pre-commit hooks using pre-commit install.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't done so yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run which python to get the interpreter path to set in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P).
  6. Run the unit tests to check that everything works:
poetry run pytest -s tests/unit_tests/


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

gllm_inference_binary-0.3.8b1-cp313-cp313-manylinux_2_36_x86_64.whl (1.3 MB)

Uploaded: CPython 3.13, manylinux (glibc 2.36+), x86-64

gllm_inference_binary-0.3.8b1-cp312-cp312-manylinux_2_36_x86_64.whl (1.3 MB)

Uploaded: CPython 3.12, manylinux (glibc 2.36+), x86-64

gllm_inference_binary-0.3.8b1-cp311-cp311-manylinux_2_36_x86_64.whl (1.2 MB)

Uploaded: CPython 3.11, manylinux (glibc 2.36+), x86-64

gllm_inference_binary-0.3.8b1-cp311-cp311-macosx_13_0_arm64.macosx_15_0_arm64.whl (907.4 kB)

Uploaded: CPython 3.11, macOS 13.0+ ARM64, macOS 15.0+ ARM64
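A quick way to see which of these wheels matches your interpreter, using only the standard library (a sketch; `packaging.tags.sys_tags()` is the more thorough check):

```python
import sys
import sysconfig

# Wheel filenames encode an interpreter tag (e.g. cp311) and a platform tag
# (e.g. manylinux_2_36_x86_64). Derive the same pieces for this interpreter:
py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"
# PEP 425 platform tags replace '-' and '.' with '_':
plat_tag = sysconfig.get_platform().replace("-", "_").replace(".", "_")
print(py_tag, plat_tag)  # compare against the tags in the wheel filenames above
```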

File details

Details for the file gllm_inference_binary-0.3.8b1-cp313-cp313-manylinux_2_36_x86_64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.3.8b1-cp313-cp313-manylinux_2_36_x86_64.whl
Algorithm Hash digest
SHA256 dbbbeb7f076c8582e45d31401758d1a060a8f33441bbc3b9ff5e27ffcd6728c2
MD5 0ce1eedf3e242a9920e08f8a50e8e94b
BLAKE2b-256 83ecfa6cea1e26ff00eb3f5da08b5dffbde6b38aef1479f06a4667bd9387e11e

See more details on using hashes here.
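The listed digests can be checked against a downloaded file. A minimal stdlib sketch (the path argument is wherever you saved the wheel):

```python
import hashlib

def sha256_hex(path: str) -> str:
    """Return the SHA256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the SHA256 value listed above, e.g.:
# sha256_hex("gllm_inference_binary-0.3.8b1-cp313-cp313-manylinux_2_36_x86_64.whl")
```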

File details

Details for the file gllm_inference_binary-0.3.8b1-cp312-cp312-manylinux_2_36_x86_64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.3.8b1-cp312-cp312-manylinux_2_36_x86_64.whl
Algorithm Hash digest
SHA256 1ce7156679c7c7f9abaf698d2364380cb2a3117bd39dc0d56eeeaf4503f2e745
MD5 330798fbce9f5dd20f8f11bd18edc85e
BLAKE2b-256 79e5391308bc264aa9b6dd5a6af57480f5d0aebefce1d2080c687415cd8e7970

See more details on using hashes here.

File details

Details for the file gllm_inference_binary-0.3.8b1-cp311-cp311-manylinux_2_36_x86_64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.3.8b1-cp311-cp311-manylinux_2_36_x86_64.whl
Algorithm Hash digest
SHA256 bbaf63132c44b555b5251822dcfacebd20bfd12e36c6cec4297564ccb075db2d
MD5 f42df10a54a7c83e2bff6556d3c22fa9
BLAKE2b-256 fa4f1138b3f1c537acfd8d21259551eaecbedd1a4ac37454cefb87409dfdcfbe

See more details on using hashes here.

File details

Details for the file gllm_inference_binary-0.3.8b1-cp311-cp311-macosx_13_0_arm64.macosx_15_0_arm64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.3.8b1-cp311-cp311-macosx_13_0_arm64.macosx_15_0_arm64.whl
Algorithm Hash digest
SHA256 161e208fe28f7337c5d5d8976776ce36edb56c86d94a9094ded3eb376b967c39
MD5 63d5c63bcc302d2f593968741aaa506c
BLAKE2b-256 6a0b16bc872d2761ff0073e36fb2c36d6076c56126da419a59ed2893b2ccaf5e

See more details on using hashes here.

Provenance

The following attestation bundles were made for gllm_inference_binary-0.3.8b1-cp311-cp311-macosx_13_0_arm64.macosx_15_0_arm64.whl:

Publisher: build-binary.yml on GDP-ADMIN/gen-ai-internal

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
