

GLLM Inference

Description

A library containing components related to model inferences in Gen AI applications.

Installation

Prerequisites

1. Installation from Artifact Registry

Choose one of the following methods to install the package:

Using pip

pip install gllm-inference-binary

Using Poetry

poetry add gllm-inference-binary

2. Development Installation (Git)

For development purposes, you can install directly from the Git repository:

poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference"

Available extras:

  • anthropic: Install Anthropic models dependencies
  • google-genai: Install Google Generative AI models dependencies
  • google-vertexai: Install Google Vertex AI models dependencies
  • huggingface: Install HuggingFace models dependencies
  • openai: Install OpenAI models dependencies
  • twelvelabs: Install TwelveLabs models dependencies
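For the development install, the extras can also be declared directly in pyproject.toml. A minimal sketch using Poetry's documented git, subdirectory, and extras keys; the particular extras chosen here (openai, anthropic) are just examples from the list above:

```toml
[tool.poetry.dependencies]
# Development install from the monorepo subdirectory, with the OpenAI and
# Anthropic extras enabled. Requires SSH access to the repository.
gllm-inference = { git = "ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git", subdirectory = "libs/gllm-inference", extras = ["openai", "anthropic"] }
```

Running poetry lock and poetry install after editing this entry pulls in the optional dependencies alongside the base package.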

Managing Dependencies

  1. Go to the root folder of the gllm-inference module, e.g. cd libs/gllm-inference.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't done so yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run poetry update whenever you change a dependency version in pyproject.toml.

Contributing

Please refer to this Python Style Guide for information about the code style, documentation standards, and SCA tools to use when contributing to this project.

  1. Activate the pre-commit hooks using pre-commit install.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't done so yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run which python to get the interpreter path to set in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P).
  6. Run the unit tests to verify that the setup works:
poetry run pytest -s tests/unit_tests/


Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions


  • gllm_inference_binary-0.5.10b4-py3-none-any.whl (102.6 kB): Python 3
  • gllm_inference_binary-0.5.10b4-cp313-cp313-manylinux_2_31_x86_64.whl (1.5 MB): CPython 3.13, manylinux (glibc 2.31+), x86-64
  • gllm_inference_binary-0.5.10b4-cp312-cp312-manylinux_2_31_x86_64.whl (1.5 MB): CPython 3.12, manylinux (glibc 2.31+), x86-64
  • gllm_inference_binary-0.5.10b4-cp311-cp311-manylinux_2_31_x86_64.whl (1.4 MB): CPython 3.11, manylinux (glibc 2.31+), x86-64

File details

Details for the file gllm_inference_binary-0.5.10b4-py3-none-any.whl.

Hashes for gllm_inference_binary-0.5.10b4-py3-none-any.whl:

  • SHA256: 3389a3658d6a00b818f54c328f0f53abdbd054d5559e185d64ab0182b83eefb3
  • MD5: 510ce04831184c3c8fa1c242c944b8f4
  • BLAKE2b-256: 84dbf54b80386b27c4d2ac14bb8cfa5e3be765e5a0ded6a08f05aa4ce8dc2a23
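A downloaded wheel can be checked against the SHA256 digest above before installing. A minimal sketch using coreutils sha256sum; the helper name verify_sha256 is illustrative, not part of the package:

```shell
# Compare a file's SHA256 digest against an expected value.
verify_sha256() {
    local file="$1" expected="$2"
    local actual
    actual=$(sha256sum "$file" | awk '{print $1}')
    if [ "$actual" = "$expected" ]; then
        echo "OK: $file"
    else
        echo "MISMATCH: $file (got $actual)" >&2
        return 1
    fi
}

# Usage, with the wheel and digest listed above:
# verify_sha256 gllm_inference_binary-0.5.10b4-py3-none-any.whl \
#     3389a3658d6a00b818f54c328f0f53abdbd054d5559e185d64ab0182b83eefb3
```

Alternatively, pip's hash-checking mode (a requirements file with --hash entries, installed via pip install --require-hashes -r requirements.txt) enforces the same pins automatically.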

Provenance

The following attestation bundles were made for gllm_inference_binary-0.5.10b4-py3-none-any.whl:

Publisher: build-binary.yml on GDP-ADMIN/gl-sdk

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file gllm_inference_binary-0.5.10b4-cp313-cp313-manylinux_2_31_x86_64.whl.

Hashes for gllm_inference_binary-0.5.10b4-cp313-cp313-manylinux_2_31_x86_64.whl:

  • SHA256: 83cad2e8391e78a5ea50be016a2a9a6395265d37956eae168157210c0b9a3ae2
  • MD5: f5481e6c90047886a47dd785cecf8cfa
  • BLAKE2b-256: 0aa4a5a264ee9e4e87ec591919b31dd9007b59438f500ca96e941459cc493e16

File details

Details for the file gllm_inference_binary-0.5.10b4-cp312-cp312-manylinux_2_31_x86_64.whl.

Hashes for gllm_inference_binary-0.5.10b4-cp312-cp312-manylinux_2_31_x86_64.whl:

  • SHA256: f3a94944d5c2705a41ad6a31b55c2c255e0d15312511fabc2171d4e0ac386899
  • MD5: 297de84485f6bc7b301b2e2997128c18
  • BLAKE2b-256: 017caece8c1a019097a968b477419b6250a2d07e24e0344fed5c915da5e1ee88

File details

Details for the file gllm_inference_binary-0.5.10b4-cp311-cp311-manylinux_2_31_x86_64.whl.

Hashes for gllm_inference_binary-0.5.10b4-cp311-cp311-manylinux_2_31_x86_64.whl:

  • SHA256: 322b4320c859a1630993c7c65139042624ced94539025209d8825d1df30a668e
  • MD5: af9a61a3337ca9809662b9901574b213
  • BLAKE2b-256: 10115a6026f45cbf3c3d10445a300391dc1f0d60e2b7dc9e469951d1c69a0eb4
