Project description

GLLM Inference

Description

A library containing components related to model inferences in Gen AI applications.

Installation

Prerequisites

1. Installation from Artifact Registry

Choose one of the following methods to install the package:

Using pip

pip install gllm-inference-binary

Using Poetry

poetry add gllm-inference-binary

2. Development Installation (Git)

For development purposes, you can install directly from the Git repository:

poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference"

Available extras:

  • anthropic: installs dependencies for Anthropic models
  • google-genai: installs dependencies for Google Generative AI models
  • google-vertexai: installs dependencies for Google Vertex AI models
  • huggingface: installs dependencies for HuggingFace models
  • openai: installs dependencies for OpenAI models
  • twelvelabs: installs dependencies for TwelveLabs models
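Assuming the published gllm-inference-binary package exposes the extras listed above (whether the binary wheel carries them is an assumption worth verifying), they can be requested with standard pip extras syntax:

```shell
# Hypothetical: install the package together with the openai extra,
# if the published wheel exposes it.
pip install "gllm-inference-binary[openai]"

# Multiple extras can be combined in a single install.
pip install "gllm-inference-binary[anthropic,google-genai]"
```

The same bracket syntax works with Poetry, e.g. poetry add "gllm-inference-binary[openai]".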

Managing Dependencies

  1. Go to the root folder of the gllm-inference module, e.g. cd libs/gllm-inference.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't already.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run poetry update whenever you change a dependency version in pyproject.toml.
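Taken together, the steps above amount to the following shell session (run from the repository root; the path assumes the monorepo layout shown in the Git installation command):

```shell
cd libs/gllm-inference   # step 1: go to the module root
poetry shell             # step 2: enter a virtual environment
poetry lock              # step 3: create the lock file (first time only)
poetry install           # step 4: install the requirements
poetry update            # step 5: only after editing pyproject.toml
```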

Contributing

Please refer to this Python Style Guide for information about the code style, documentation standards, and SCA tools to use when contributing to this project.

  1. Activate the pre-commit hooks using pre-commit install.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't already.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run which python to get the interpreter path to reference in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P).
  6. Run the unit tests to verify everything works:
poetry run pytest -s tests/unit_tests/

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions

gllm_inference_binary-0.3.0b1-cp313-cp313-manylinux_2_31_x86_64.whl (1.2 MB)

Uploaded: CPython 3.13, manylinux: glibc 2.31+, x86-64

gllm_inference_binary-0.3.0b1-cp312-cp312-manylinux_2_31_x86_64.whl (1.2 MB)

Uploaded: CPython 3.12, manylinux: glibc 2.31+, x86-64

gllm_inference_binary-0.3.0b1-cp311-cp311-manylinux_2_31_x86_64.whl (1.1 MB)

Uploaded: CPython 3.11, manylinux: glibc 2.31+, x86-64

File details

Details for the file gllm_inference_binary-0.3.0b1-cp313-cp313-manylinux_2_31_x86_64.whl.

File hashes

SHA256: 5278c745a0a483f33dad5d3d2b1235d9412ff48e0093affb11d9ac92a3f179f6
MD5: e7262a91ed666b9b4f600094a7531d3c
BLAKE2b-256: cdd013222559a5a5012a936bd9dc245bb9d3db26da0baffa4a5844bb9a0cc831

File details

Details for the file gllm_inference_binary-0.3.0b1-cp312-cp312-manylinux_2_31_x86_64.whl.

File hashes

SHA256: 222151bd7bd863eebbf2536adab2690f59df30c0461dbd9e674025111a3be65b
MD5: 577b8384eecec91e7e7dc0d17a582bf7
BLAKE2b-256: c9ffd0ca04bb565608bffc94e9ed006d2ae0c32e8f1e0bf88c689c6ae43a22e8

File details

Details for the file gllm_inference_binary-0.3.0b1-cp311-cp311-manylinux_2_31_x86_64.whl.

File hashes

SHA256: 01eb71f4650d4644837f873713012bf343e8c0add5231a304b40cc408efac605
MD5: 7721fc9d4821d38cb414bf9049705516
BLAKE2b-256: 1d84ccc16f5e27cf02a29d6a6560d494408a8608d8ead1b6db276d35658ee80d
