A package for sending inference requests to LLM models custom-hosted in OCI (Oracle Cloud Infrastructure)

Project description

The author of this package has not provided a project description
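
Since no description was provided, here is a minimal sketch of the general pattern the tagline suggests: sending a signed inference request to a custom model deployment on OCI. It uses the official oci SDK for request signing and requests for HTTP; the endpoint URL and the payload fields (prompt, max_tokens) are hypothetical placeholders, not necessarily this package's own API.

import oci
import requests

# Load credentials from the standard OCI config file (~/.oci/config).
config = oci.config.from_file()

# Build a request signer; OCI REST endpoints require signed requests.
signer = oci.signer.Signer(
    tenancy=config["tenancy"],
    user=config["user"],
    fingerprint=config["fingerprint"],
    private_key_file_location=config["key_file"],
)

# Hypothetical model-deployment endpoint and payload; adjust to your deployment.
endpoint = "https://modeldeployment.<region>.oci.customer-oci.com/<deployment-ocid>/predict"
payload = {"prompt": "Hello", "max_tokens": 64}

response = requests.post(endpoint, json=payload, auth=signer)
response.raise_for_status()
print(response.json())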

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
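
If you are installing from PyPI rather than downloading a file by hand, the usual route is pip install oci-inference-custommodel (PyPI normalizes project names, so hyphens and underscores are interchangeable).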

Source Distribution

oci_inference_custommodel-0.1.0.tar.gz (2.6 kB)

Uploaded Source

Built Distribution

OCI_Inference_CustomModel-0.1.0-py3-none-any.whl

Uploaded Python 3

File details

Details for the file oci_inference_custommodel-0.1.0.tar.gz.

File metadata

File hashes

Hashes for oci_inference_custommodel-0.1.0.tar.gz
Algorithm Hash digest
SHA256 8f0a4ce6b77741026e3e3ccc2960b75a09998e35a473f98eb4cf24fcba706ae0
MD5 a70e3f5ab92c6665b05885300ad5ac64
BLAKE2b-256 eed2495eae9ce1578bc11bdec9d151b3b53d3d83a7ffc145d6a574d68b50461d

See more details on using hashes here.
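
As a quick integrity check, the SHA256 digest above can be reproduced locally with Python's hashlib; the sketch below assumes the sdist has been downloaded to the current directory.

import hashlib

# Compute the SHA256 digest of the downloaded source distribution.
with open("oci_inference_custommodel-0.1.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# The printed value should match the SHA256 listed above.
print(digest)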

File details

Details for the file OCI_Inference_CustomModel-0.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for OCI_Inference_CustomModel-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 ab6e174a0b103e5e3871440beaeef22e4d1ffe76df01f1f67fb65e625b507b9a
MD5 0d8836823350f2384f0f7cc3b0418c4d
BLAKE2b-256 343ce7a26114d5a96973b170bbf40f92d1df0f32bcbf58b004e7a4e5fe8f1483

See more details on using hashes here.
