
GLLM Inference

Description

A library containing components related to model inferences in Gen AI applications.

Installation

  1. Install Python v3.11 or above. You can install Python using Miniconda. Make sure you're in the base conda environment:

conda activate

  2. Install Poetry v1.8.1 or above. You can install Poetry using cURL (Python is required to install Poetry):

curl -sSL https://install.python-poetry.org | python3 -

  3. Install the library using Poetry:

# Latest
poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference"

# Specific version
poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git@gllm_inference-v0.0.1-beta.1#subdirectory=libs/gllm-inference"

# Specific branch
poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git@<BRANCH NAME>#subdirectory=libs/gllm-inference"

# With extra dependencies
poetry add "git+ssh://git@github.com/GDP-ADMIN/gen-ai-internal.git#subdirectory=libs/gllm-inference" --extras "extra1 extra2"

Available extras:

  • anthropic: installs Anthropic model dependencies
  • google-genai: installs Google Generative AI model dependencies
  • google-vertexai: installs Google Vertex AI model dependencies
  • huggingface: installs HuggingFace model dependencies
  • openai: installs OpenAI model dependencies
  • twelvelabs: installs TwelveLabs model dependencies

  4. At this point, you can deactivate the Miniconda environment, as Poetry creates and manages its own virtual environment for you:

conda deactivate

Managing Dependencies

  1. Go to the root folder of the gllm-inference module, e.g. cd libs/gllm-inference.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't done so yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run poetry update if you update any dependency version in pyproject.toml.

Contributing

Please refer to the Python Style Guide for information about the code style, documentation standards, and SCA tooling to use when contributing to this project.

  1. Activate the pre-commit hooks using pre-commit install.
  2. Run poetry shell to create a virtual environment.
  3. Run poetry lock to create a lock file if you haven't done so yet.
  4. Run poetry install to install the gllm-inference requirements for the first time.
  5. Run which python to get the interpreter path to reference in Visual Studio Code (Ctrl+Shift+P or Cmd+Shift+P).
  6. Try running the unit tests to verify the setup works:

poetry run pytest -s tests/unit_tests/


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

  • gllm_inference_binary-0.2.38-cp312-cp312-win_amd64.whl (723.8 kB): CPython 3.12, Windows x86-64
  • gllm_inference_binary-0.2.38-cp312-cp312-manylinux_2_31_x86_64.whl (1.0 MB): CPython 3.12, manylinux (glibc 2.31+), x86-64
  • gllm_inference_binary-0.2.38-cp312-cp312-macosx_14_0_arm64.whl (782.7 kB): CPython 3.12, macOS 14.0+, ARM64
  • gllm_inference_binary-0.2.38-cp312-cp312-macosx_13_0_x86_64.whl (869.3 kB): CPython 3.12, macOS 13.0+, x86-64
  • gllm_inference_binary-0.2.38-cp311-cp311-win_amd64.whl (727.3 kB): CPython 3.11, Windows x86-64
  • gllm_inference_binary-0.2.38-cp311-cp311-manylinux_2_31_x86_64.whl (964.7 kB): CPython 3.11, manylinux (glibc 2.31+), x86-64
  • gllm_inference_binary-0.2.38-cp311-cp311-macosx_14_0_arm64.whl (776.2 kB): CPython 3.11, macOS 14.0+, ARM64
  • gllm_inference_binary-0.2.38-cp311-cp311-macosx_13_0_x86_64.whl (862.4 kB): CPython 3.11, macOS 13.0+, x86-64
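The wheel names above follow the standard {distribution}-{version}-{python tag}-{abi tag}-{platform tag}.whl convention. A minimal sketch of pulling those fields apart (the helper below is illustrative, not part of gllm-inference):

```python
def parse_wheel_name(filename: str) -> dict[str, str]:
    """Split a wheel filename into its standard tag fields."""
    stem = filename.removesuffix(".whl")
    # A wheel stem has five dash-separated fields (six with an optional build
    # tag); split from the right so any dashes in the version are preserved.
    distribution, version, python_tag, abi_tag, platform_tag = stem.rsplit("-", 4)
    return {
        "distribution": distribution,
        "version": version,
        "python_tag": python_tag,
        "abi_tag": abi_tag,
        "platform_tag": platform_tag,
    }


parse_wheel_name("gllm_inference_binary-0.2.38-cp312-cp312-win_amd64.whl")
# {'distribution': 'gllm_inference_binary', 'version': '0.2.38',
#  'python_tag': 'cp312', 'abi_tag': 'cp312', 'platform_tag': 'win_amd64'}
```

Reading the tags this way tells you which wheel matches your interpreter and platform, e.g. cp311 wheels for Python 3.11 and cp312 wheels for Python 3.12.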

File details

Details for the file gllm_inference_binary-0.2.38-cp312-cp312-win_amd64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.2.38-cp312-cp312-win_amd64.whl
Algorithm Hash digest
SHA256 3f26ceb5464002bf9c8f330dcb5b9ec92919755560105f15f8b046799ec17b1e
MD5 f17076772b13ffc460d6a9f85fc05363
BLAKE2b-256 6147a74ee6502702b92c3af3e98942378413fa2502ca2ca2b9c8a8269d85e399

See more details on using hashes here.
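To check a downloaded wheel against the SHA256 digest listed above, you can hash the file locally. A minimal sketch (the file path is whatever location you downloaded the wheel to):

```python
import hashlib


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the hex SHA256 digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()


# Compare the result against the digest published on this page, e.g.:
# sha256_of("gllm_inference_binary-0.2.38-cp312-cp312-win_amd64.whl") \
#     == "3f26ceb5464002bf9c8f330dcb5b9ec92919755560105f15f8b046799ec17b1e"
```

A mismatch means the file was corrupted in transit or is not the file that was published, and it should not be installed.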

Provenance

The following attestation bundles were made for gllm_inference_binary-0.2.38-cp312-cp312-win_amd64.whl:

Publisher: build-binary.yml on GDP-ADMIN/gen-ai-internal

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file gllm_inference_binary-0.2.38-cp312-cp312-manylinux_2_31_x86_64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.2.38-cp312-cp312-manylinux_2_31_x86_64.whl
Algorithm Hash digest
SHA256 98b585f5f5961e0d3bed032fef0a6d06baf28c36b0bd393a2e35b565b8abc144
MD5 076ada0a3ff0969c6d9a0446a728a0ad
BLAKE2b-256 d997d4838fa71745917cd4c392d1e21881bde5a78dbdd890f287031e3e29eabd

See more details on using hashes here.

File details

Details for the file gllm_inference_binary-0.2.38-cp312-cp312-macosx_14_0_arm64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.2.38-cp312-cp312-macosx_14_0_arm64.whl
Algorithm Hash digest
SHA256 e379f87d55385577ca54719f1314361b079deed83c8387fa38d9b67877eff623
MD5 f8b7a674a139ad3fd682d5a6eec2bf18
BLAKE2b-256 815ac3fa751cb3544ec086f92ff01fbf5c03434dcafb20f24bcc80c92e24ee7f

See more details on using hashes here.

Provenance

The following attestation bundles were made for gllm_inference_binary-0.2.38-cp312-cp312-macosx_14_0_arm64.whl:

Publisher: build-binary.yml on GDP-ADMIN/gen-ai-internal

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file gllm_inference_binary-0.2.38-cp312-cp312-macosx_13_0_x86_64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.2.38-cp312-cp312-macosx_13_0_x86_64.whl
Algorithm Hash digest
SHA256 b969ef79e5ff149597bcf1ce4145c2bce2b6c482f14c320d924d7e91837b4ece
MD5 33b0e83e039b673376f0f2a03af506ff
BLAKE2b-256 874ea6851445f8e17fb94318ec23f41b5bbacc93ebc88fefbe6d7a1062a4d4fa

See more details on using hashes here.

Provenance

The following attestation bundles were made for gllm_inference_binary-0.2.38-cp312-cp312-macosx_13_0_x86_64.whl:

Publisher: build-binary.yml on GDP-ADMIN/gen-ai-internal

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file gllm_inference_binary-0.2.38-cp311-cp311-win_amd64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.2.38-cp311-cp311-win_amd64.whl
Algorithm Hash digest
SHA256 2d53b46c4cff5a369237825006210d0ca2c3c3f159a5cd6e21b8acd9d251350d
MD5 e0d0a7a788b3bcb3b8e7a73cae7f0a5a
BLAKE2b-256 50275d579e5796500ab0eaf5fbdc518e2910667737377e59a43c32a30aee7ff3

See more details on using hashes here.

Provenance

The following attestation bundles were made for gllm_inference_binary-0.2.38-cp311-cp311-win_amd64.whl:

Publisher: build-binary.yml on GDP-ADMIN/gen-ai-internal

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file gllm_inference_binary-0.2.38-cp311-cp311-manylinux_2_31_x86_64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.2.38-cp311-cp311-manylinux_2_31_x86_64.whl
Algorithm Hash digest
SHA256 921535dcd66f1e81ee05bcb5decaa5b31db80de6e9e4e25f07a20f29fdbbe213
MD5 0c6803677bd729b22aecc6c935521508
BLAKE2b-256 4d4505cd75f33c31c2a12db65762160fb3719672e054aacdbaad1dea5ada82dd

See more details on using hashes here.

File details

Details for the file gllm_inference_binary-0.2.38-cp311-cp311-macosx_14_0_arm64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.2.38-cp311-cp311-macosx_14_0_arm64.whl
Algorithm Hash digest
SHA256 6a748225f42e8da797879e40ae0f4b2287e2c9c05abb997b2be0410d203bfc5e
MD5 afcdd2e81adc5ab905339c3e5e6358c7
BLAKE2b-256 701e7b2e69e98a044aece6be7a3c7901ef8f24a2a4ac3a9ea94bd663faf709fb

See more details on using hashes here.

Provenance

The following attestation bundles were made for gllm_inference_binary-0.2.38-cp311-cp311-macosx_14_0_arm64.whl:

Publisher: build-binary.yml on GDP-ADMIN/gen-ai-internal

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file gllm_inference_binary-0.2.38-cp311-cp311-macosx_13_0_x86_64.whl.

File metadata

File hashes

Hashes for gllm_inference_binary-0.2.38-cp311-cp311-macosx_13_0_x86_64.whl
Algorithm Hash digest
SHA256 e9a3b39813103fd79a104a45b9697324388801c7d55e5910080f62981b683e41
MD5 fb60e493a0c2e4c9d473c39743738756
BLAKE2b-256 a108839d3eaf0d38f1c411e5e95cbc8da3b79421113b0feeaa452e254d1c8351

See more details on using hashes here.

Provenance

The following attestation bundles were made for gllm_inference_binary-0.2.38-cp311-cp311-macosx_13_0_x86_64.whl:

Publisher: build-binary.yml on GDP-ADMIN/gen-ai-internal

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
