# Semantix GenAI Inference

A Python client library for interacting with the Semantix GenAI Inference API.
## Installation

If you're using pip, just install it from the latest release:

```shell
$ pip install semantix-genai-inference
```
If you want to run it locally instead, clone this repository and install it with Poetry:

```shell
$ poetry build
$ poetry install
```
## Usage

First, make sure you have a valid API key. You can get one at Semantix Gen AI Hub.

Set an environment variable with your API secret:

```shell
$ export SEMANTIX_API_SECRET=<YOUR_API_SECRET>
$ semantix-ai --help
```
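Since the client reads the secret from the environment, a missing or empty variable is worth catching before any request is made. A minimal stdlib sketch (the variable name comes from the step above; the helper function itself is just an illustration, not part of the library):

```python
import os


def get_api_secret() -> str:
    """Read SEMANTIX_API_SECRET from the environment, failing fast
    with a clear message if it is missing or empty."""
    secret = os.environ.get("SEMANTIX_API_SECRET", "").strip()
    if not secret:
        raise RuntimeError(
            "SEMANTIX_API_SECRET is not set; export it before running, "
            "e.g. export SEMANTIX_API_SECRET=<YOUR_API_SECRET>"
        )
    return secret
```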
## Configuring with semantix.yaml

Before using the `ModelClient`, you need to configure the library with a `semantix.yaml` file placed in the directory from which your application is executed. This file contains the API keys and other configuration options for the models you want to use.

Here's an example of a `semantix.yaml` file:
```yaml
providers:
  semantixHub:
    serverId: "YOUR INFERENCE SERVER ID HERE"
    version: "v0"
    apiSecret: "YOUR SEMANTIX GEN AI HUB API TOKEN"
  cohere:
    apiKey: "YOUR COHERE API KEY"
    generate:
      model: "command"
      version: "v1"
```
Replace the placeholders with your actual API keys and other configuration options.
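A missing key in `semantix.yaml` typically only surfaces at request time, so it can help to validate the file at startup. A minimal sketch, assuming PyYAML is available (`pip install pyyaml`); the required key names mirror the `semantixHub` example above, and the helper function is illustrative, not part of the library:

```python
import yaml

# Keys the semantixHub provider declares in the example above
REQUIRED_HUB_KEYS = {"serverId", "version", "apiSecret"}


def check_semantix_config(text: str) -> dict:
    """Parse semantix.yaml content and verify the semantixHub
    provider declares the expected keys."""
    config = yaml.safe_load(text)
    hub = config.get("providers", {}).get("semantixHub", {})
    missing = REQUIRED_HUB_KEYS - hub.keys()
    if missing:
        raise ValueError(f"semantix.yaml is missing keys: {sorted(missing)}")
    return config
```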
## Using ModelClient

The `ModelClient` class lets you interact with different models. To create a model client, specify the type of model you want to use: the available options are `"alpaca"`, `"llama2"`, and `"cohere"`.

Here's an example of creating a model client and generating text with the Alpaca model:
```python
from semantix_genai_inference import ModelClient

# Create an Alpaca model client
client = ModelClient.create("alpaca")

# Generate text using the Alpaca model
prompt = "Once upon a time"
generated_text = client.generate(prompt)
print(generated_text)
```
You can replace `"alpaca"` with `"llama2"` or `"cohere"` to use the Llama2 or Cohere models, respectively.
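An unrecognized model string would otherwise fail somewhere inside the library, so a small guard around the factory call can give a clearer error. A sketch only: the `factory` parameter exists just to keep the snippet self-contained; in practice you would pass `ModelClient.create`:

```python
# The three model types documented above; anything else is rejected
# with an explicit error instead of failing deep inside the library.
SUPPORTED_MODELS = {"alpaca", "llama2", "cohere"}


def make_client(model_type: str, factory):
    """Validate model_type, then delegate to the given factory
    (in practice, ModelClient.create)."""
    if model_type not in SUPPORTED_MODELS:
        raise ValueError(
            f"Unknown model type {model_type!r}; "
            f"expected one of {sorted(SUPPORTED_MODELS)}"
        )
    return factory(model_type)
```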
## DEV - Publish to PyPI

```shell
$ poetry config pypi-token.pypi <YOUR_PYPI_TOKEN>
$ poetry build
$ poetry publish
```
## DEV - Bump version

```shell
$ poetry version patch | minor | major | premajor | preminor | prepatch | prerelease
```

See more in the Poetry version command docs.
## DEV - Commit message semantics

See Conventional Commits.