# Semantix GenAI Inference

A Python client library for interacting with the Semantix GenAI Inference API.
## Installation

If you're using pip, install the latest release:

```
$ pip install semantix-genai-inference
```

If you want to run from source instead, clone this repository and install it with Poetry:

```
$ poetry build
$ poetry install
```
## Usage

First, make sure you have a valid API key. You can get one at Semantix Gen AI Hub.

Set an environment variable with your API secret:

```
$ export SEMANTIX_API_SECRET=<YOUR_API_SECRET>
```

Then check that the CLI is available:

```
$ semantix-ai --help
```
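The library picks up the API secret from the environment. A minimal sketch of how an application might fail fast when the variable is unset (the `get_api_secret` helper below is illustrative and not part of the library, which reads the variable internally):

```python
import os

def get_api_secret() -> str:
    """Return the Semantix API secret, failing fast if it is not set.

    Illustrative only: the library itself reads SEMANTIX_API_SECRET
    internally, but checking up front gives a clearer error message.
    """
    secret = os.environ.get("SEMANTIX_API_SECRET")
    if not secret:
        raise RuntimeError(
            "SEMANTIX_API_SECRET is not set; export it before running."
        )
    return secret
```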
## Configuring with semantix.yaml

Before using the `ModelClient`, you need to configure the library with a `semantix.yaml` file. Place this file in the directory from which your application is executed. It should contain the API keys and other configuration options for the models you want to use.

Here's an example `semantix.yaml` file:

```yaml
providers:
  semantixHub:
    serverId: "YOUR INFERENCE SERVER ID HERE"
    version: "v0"
    apiSecret: "YOUR SEMANTIX GEN AI HUB API TOKEN"
  cohere:
    apiKey: "YOUR COHERE API KEY"
    generate:
      model: "command"
      version: "v1"
```

Replace the placeholders with your actual API keys and configuration options.
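Once parsed (e.g. with PyYAML's `yaml.safe_load`), the file above yields a nested mapping. A sketch of that structure and of how client code might read one provider's settings (the traversal below is illustrative, not the library's internal API):

```python
# The parsed semantix.yaml from the example above, as a Python mapping.
config = {
    "providers": {
        "semantixHub": {
            "serverId": "YOUR INFERENCE SERVER ID HERE",
            "version": "v0",
            "apiSecret": "YOUR SEMANTIX GEN AI HUB API TOKEN",
        },
        "cohere": {
            "apiKey": "YOUR COHERE API KEY",
            "generate": {"model": "command", "version": "v1"},
        },
    }
}

# Illustrative traversal: pull out the Semantix Hub provider settings.
hub = config["providers"]["semantixHub"]
print(hub["serverId"], hub["version"])
```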
## Using ModelClient

The `ModelClient` class allows you to interact with different models. To create a model client, specify the type of model you want to use. The available options are `"alpaca"`, `"llama2"`, and `"cohere"`.

Here's an example of how to create a model client and generate text using the Alpaca model:

```python
from semantix_genai_inference import ModelClient

# Create an Alpaca model client
client = ModelClient.create("alpaca")

# Generate text using the Alpaca model
prompt = "Once upon a time"
generated_text = client.generate(prompt)
print(generated_text)
```

You can replace `"alpaca"` with `"llama2"` or `"cohere"` to use the Llama2 or Cohere models, respectively.
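Since the model type is a plain string, a thin wrapper that validates it before calling `ModelClient.create` can catch typos early. A sketch under the assumption that only the three names above are supported (the `make_client` helper and `SUPPORTED_MODELS` set are hypothetical, not part of the library):

```python
SUPPORTED_MODELS = {"alpaca", "llama2", "cohere"}  # the options listed above

def make_client(model_type: str):
    """Validate the model name, then delegate to ModelClient.create.

    Raises ValueError for unknown model names instead of failing
    later inside the library.
    """
    if model_type not in SUPPORTED_MODELS:
        raise ValueError(
            f"Unknown model {model_type!r}; expected one of {sorted(SUPPORTED_MODELS)}"
        )
    # Imported lazily so the validation above works without the package installed.
    from semantix_genai_inference import ModelClient
    return ModelClient.create(model_type)
```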
## DEV - Publish to PyPI

```
$ poetry config pypi-token.pypi <YOUR_PYPI_TOKEN>
$ poetry build
$ poetry publish
```

## DEV - Bump version

```
$ poetry version patch | minor | major | premajor | preminor | prepatch | prerelease
```

See more at the Poetry version command docs.

## DEV - Commit message semantics

See Conventional Commits.