
GLLM Inference

Description

A library containing components related to model inferences in Gen AI applications.

Installation

Prerequisites

1. Installation from Artifact Registry

Choose one of the following methods to install the package:

Using pip

pip install gllm-inference-binary

Using Poetry

poetry add gllm-inference-binary

2. Development Installation (Git)

For development purposes, you can install directly from the Git repository:

poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference"

Available extras:

  • anthropic: Install Anthropic models dependencies
  • google-genai: Install Google Generative AI models dependencies
  • google-vertexai: Install Google Vertex AI models dependencies
  • huggingface: Install HuggingFace models dependencies
  • openai: Install OpenAI models dependencies
  • twelvelabs: Install TwelveLabs models dependencies
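Any of the extras listed above can be requested at install time with the standard bracket syntax; for example, to pull in the OpenAI dependencies (quote the requirement so the shell does not expand the brackets):

```shell
# With pip, a single extra:
pip install "gllm-inference-binary[openai]"

# With Poetry, the same syntax accepts multiple extras:
poetry add "gllm-inference-binary[openai,anthropic]"
```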

Managing Dependencies

  1. Go to the root folder of the gllm-inference module, e.g. cd libs/gllm-inference.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't already.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run poetry update if you update any dependency version in pyproject.toml.
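The steps above can be sketched as a single shell session (assuming Poetry is already installed and you are at the repository root):

```shell
cd libs/gllm-inference   # root folder of the gllm-inference module
poetry shell             # create and enter a virtual environment
poetry lock              # create the lock file if it does not exist yet
poetry install           # install the gllm-inference requirements
poetry update            # only needed after changing versions in pyproject.toml
```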

Contributing

Please refer to this Python Style Guide for information about the code style, documentation standards, and SCA tools you need to use when contributing to this project.

  1. Activate pre-commit hooks using pre-commit install.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't already.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run which python to get the interpreter path to set in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P, then "Python: Select Interpreter").
  6. Run the unit tests to confirm everything works:
     poetry run pytest -s tests/unit_tests/


Download files

Download the file for your platform. If you're not sure which to choose, see the Python Packaging User Guide on installing packages.

Source Distributions

No source distribution files are available for this release.

Built Distributions

If you're not sure about the file name format, see the wheel file name specification (PEP 427).

  • gllm_inference_binary-0.5.10b3-py3-none-any.whl (102.6 kB): Python 3
  • gllm_inference_binary-0.5.10b3-cp313-cp313-manylinux_2_31_x86_64.whl (1.5 MB): CPython 3.13, manylinux: glibc 2.31+, x86-64
  • gllm_inference_binary-0.5.10b3-cp312-cp312-manylinux_2_31_x86_64.whl (1.5 MB): CPython 3.12, manylinux: glibc 2.31+, x86-64
  • gllm_inference_binary-0.5.10b3-cp311-cp311-manylinux_2_31_x86_64.whl (1.4 MB): CPython 3.11, manylinux: glibc 2.31+, x86-64

File details

Details for the file gllm_inference_binary-0.5.10b3-py3-none-any.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.5.10b3-py3-none-any.whl:

  • SHA256: 9fa091755d66add6b500e6a74d42e773ea827174cdc2b292257f08ca334cf24f
  • MD5: bfe377b6521e0fddcc1dab2f6e5aec59
  • BLAKE2b-256: b2c90f25108ed480acb38673db31672c9eb72d60c64effca9589bf5e1601583b

See the pip documentation on hash-checking mode for more details on using hashes.
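One way to check a downloaded wheel against the digests above is to recompute the SHA-256 locally. The helper below is a minimal sketch using only the Python standard library; the wheel filename in the comment is the one published on this page, but the local path is whatever your download tool produced:

```python
import hashlib
from pathlib import Path

def sha256_hex(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the published digest before trusting a download, e.g.:
# sha256_hex(Path("gllm_inference_binary-0.5.10b3-py3-none-any.whl")) \
#     == "9fa091755d66add6b500e6a74d42e773ea827174cdc2b292257f08ca334cf24f"
```

For automated installs, the same check can be delegated to pip's hash-checking mode by pinning the digest in a requirements file and installing with --require-hashes.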

Provenance

The following attestation bundles were made for gllm_inference_binary-0.5.10b3-py3-none-any.whl:

Publisher: build-binary.yml on GDP-ADMIN/gl-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file gllm_inference_binary-0.5.10b3-cp313-cp313-manylinux_2_31_x86_64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.5.10b3-cp313-cp313-manylinux_2_31_x86_64.whl:

  • SHA256: c998fd3d0fb5130fc592dad020cd37dc5cfce6fbea5d525b451124912039417c
  • MD5: ff8e062c4ae4f12fd13b5f47e0e4437b
  • BLAKE2b-256: fbef983ded4972aac738d77ad672e2b648b82d5a69b425b58d263728ad74e524


File details

Details for the file gllm_inference_binary-0.5.10b3-cp312-cp312-manylinux_2_31_x86_64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.5.10b3-cp312-cp312-manylinux_2_31_x86_64.whl:

  • SHA256: a38979c62ac4203b31c9994d50367dfe1e23c6d3eceb6b1c217f389ce89228af
  • MD5: 5817dbf56995e8cea83b290d3b7404ca
  • BLAKE2b-256: 0df2b23e571c169135dd724e0a9c1b562c399971b5b4d98419c154988935c5d4


File details

Details for the file gllm_inference_binary-0.5.10b3-cp311-cp311-manylinux_2_31_x86_64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.5.10b3-cp311-cp311-manylinux_2_31_x86_64.whl:

  • SHA256: 86ea40e99724b286aa4e48fba4660930794f94445bc740b7dad9c2afd22ae87a
  • MD5: d241015e64b3b7d464bd6a152d40813e
  • BLAKE2b-256: bb461ae17fa8e6ef51e3d835487bac57305086c00908ce047cf1e57fd5b372ea

