
A package for sending inference requests to custom-hosted LLM models in OCI

Project description

The author of this package has not provided a project description



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
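As a minimal sketch, the release shown on this page can be installed with pip; the version pin below assumes you want exactly the 0.2.0 release listed here:

```shell
# Install the 0.2.0 release of this package from PyPI
python -m pip install oci_inference_custommodel==0.2.0
```

Pinning the version ensures the downloaded file matches the hashes published on this page.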

Source Distribution

oci_inference_custommodel-0.2.0.tar.gz (2.6 kB)

Uploaded Source

Built Distribution

OCI_Inference_CustomModel-0.2.0-py3-none-any.whl

File details

Details for the file oci_inference_custommodel-0.2.0.tar.gz.

File metadata

File hashes

Hashes for oci_inference_custommodel-0.2.0.tar.gz
Algorithm Hash digest
SHA256 74555b3df1de5038e2f62d4f59200095922b3b25ed77104ce8797e44fd36a856
MD5 eaf863910ed8ff1b4171bc73222b362c
BLAKE2b-256 625f4ba52c818ca36c32e23015b1413a52aa62303d8d0bbe2c6087960a573b27
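A downloaded distribution can be checked against the SHA256 digest published above before installing. The sketch below uses only the standard library's hashlib; the file path assumes the sdist was saved to the current directory:

```python
import hashlib

# Expected SHA256 for oci_inference_custommodel-0.2.0.tar.gz (from the hash table above)
EXPECTED_SHA256 = "74555b3df1de5038e2f62d4f59200095922b3b25ed77104ce8797e44fd36a856"

def sha256_of_file(path: str, chunk_size: int = 8192) -> str:
    """Stream the file through SHA-256 in chunks so large files are not read into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Usage (assuming the sdist was downloaded to the current directory):
# assert sha256_of_file("oci_inference_custommodel-0.2.0.tar.gz") == EXPECTED_SHA256
```

pip can also enforce this automatically via its hash-checking mode (`--require-hashes` with a pinned requirements file).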


File details

Details for the file OCI_Inference_CustomModel-0.2.0-py3-none-any.whl.

File metadata

File hashes

Hashes for OCI_Inference_CustomModel-0.2.0-py3-none-any.whl
Algorithm Hash digest
SHA256 1ce51f27ea85e7a3a6c4ae578a557098ead77c76fec0831fdd15575eb70a3f6f
MD5 42ab9dc238596f91639df33f37e0ebd6
BLAKE2b-256 2e1959253c7d9213559aa061632b275c022684db54c31c0b20cbe308f4266a22

