Python Client for the Liquid AI API
📦 Installation
pip install -U liquidai
OpenAI-compatible API
For OpenAI- and LangChain-compatible APIs such as /chat/completions and /embeddings.
Example:
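Since these routes follow the OpenAI wire format, the standard openai Python package can be pointed at a Liquid deployment. A minimal sketch, assuming the LIQUID_URL and LIQUID_API_KEY environment variables described below and the liquid-beacon-1.0 model from the example output; the exact base URL path for the OpenAI-compatible routes may differ on your deployment:

import os
from openai import OpenAI

# Reuse the Liquid credentials from the environment (see the API Keys section below).
client = OpenAI(
    base_url=os.environ["LIQUID_URL"],       # assumption: OpenAI-compatible routes live under this URL
    api_key=os.environ["LIQUID_API_KEY"],
)

response = client.chat.completions.create(
    model="liquid-beacon-1.0",               # assumption: use a model returned by list_models()
    messages=[{"role": "user", "content": "Hello world in python!"}],
)
print(response.choices[0].message.content)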
💬 Other Liquid API endpoints
For retrieval-augmentation-enabled Liquid AI API endpoints, such as /complete with files as arguments.
To access these APIs, set the environment variables LIQUID_URL and LIQUID_API_KEY to the URL and the API key of your Liquid AI subscription, respectively.
You can find your API key in the profile tab of the Liquid platform (bottom-left icon in the navigation bar).
🔐 API Keys
The most secure way to configure the client is via environment variables, which the Liquid client picks up automatically:
export LIQUID_URL="https://labs.liquid.ai/api/v1"
export LIQUID_API_KEY="9cba1....."
Alternatively, you can pass the base_url and api_key parameters to the Client constructor.
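If you prefer not to rely on environment variables, here is a minimal sketch of explicit configuration (the import path from liquidai import Client is assumed from the package name; the URL is the one shown above, and the key is a placeholder):

from liquidai import Client

client = Client(
    base_url="https://labs.liquid.ai/api/v1",  # your Liquid deployment URL
    api_key="9cba1.....",                      # your API key (placeholder)
)

With the environment variables set, the constructor needs no arguments: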
from liquidai import Client

# Create a client; with no arguments it reads LIQUID_URL and LIQUID_API_KEY from the environment
client = Client()
print("Models: ", client.list_models()) # List all models
# Create a conversation with the model (a list of messages)
chat = [{"role": "user", "content": "Hello world in python!"}]
response = client.complete(chat)
print(f"Response: {response['message']['content']}")
Output:
>>> Models: ['liquid-beacon-1.0']
>>> Response: Here is how to code a Hello World program in Python: print("Hello, world!")
Multi-turn conversations:
chat.append(response["message"]) # add assistant message to conversation
chat.append({"role": "user", "content": "And in C++?"})
response = client.complete(chat)
print(f"Response: {response['message']['content']}")
Output:
>>> #include <iostream>
>>>
>>> int main() {
>>> std::cout << "Hello, World!" << std::endl;
>>> return 0;
>>> }
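Because complete always receives the full message list, multi-turn chat is just repeated appending. A minimal interactive loop, sketched with the same client and the complete / response["message"] shapes used above:

chat = []
while True:
    user_input = input("You: ")
    if not user_input:                         # empty line ends the session
        break
    chat.append({"role": "user", "content": user_input})
    response = client.complete(chat)           # the whole conversation is sent every turn
    chat.append(response["message"])           # keep the assistant reply in the history
    print("Assistant:", response["message"]["content"])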
📚 Adding Knowledge Bases to the Model
# Let's create an example knowledge base
test_file = "test.txt"
with open(test_file, "w") as f:
    f.write("The name of the CEO of Liquid is Ramin Hasani.")
# Upload the file to the server
response = client.upload_file(test_file)
print(f"Uploaded {test_file} to {response['filename']}")
files = client.list_files()
print(f"Files: {files}")
Output:
>>> Uploaded test.txt to test.txt
>>> Files: ['test.txt']
Next we can tell the model to use the document we just uploaded:
chat = [
    {"role": "user", "content": "Who is the CEO of Liquid?", "files": ["test.txt"]}
]
response = client.complete(chat)
print(f"Response: {response['message']['content']}")
Output:
>>> Response: The CEO of Liquid is Ramin Hasani.
Removing files: Finally, we can delete the file from the server:
client.delete_file(test_file)
print(f"Deleted {test_file}")
files = client.list_files()
print(f"Files: {files}")
Output:
>>> Deleted test.txt
>>> Files: []
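Putting the knowledge-base calls together, here is a sketch that uploads a document, queries against it, and always removes it again, using only the upload_file, complete, delete_file, and list_files methods shown above:

from liquidai import Client

client = Client()

test_file = "test.txt"
with open(test_file, "w") as f:
    f.write("The name of the CEO of Liquid is Ramin Hasani.")

client.upload_file(test_file)
try:
    chat = [{"role": "user", "content": "Who is the CEO of Liquid?", "files": [test_file]}]
    response = client.complete(chat)
    print(response["message"]["content"])
finally:
    client.delete_file(test_file)              # clean up the server copy even if the query fails
    print("Remaining files:", client.list_files())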
📌 Full Examples
- Quickstart: full example of the basic usage described above.
- AI2 Reasoning Challenge: runs the AI2 Reasoning Challenge via the Liquid platform.
- Code clone detection benchmark: runs part of the CodeXGLUE code clone detection benchmark.
- Upload multiple files: script to upload a folder of files to the Liquid platform (a minimal sketch of the same idea follows below).
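A minimal sketch of the multi-file upload idea, looping upload_file over a directory (the docs folder and *.txt pattern are placeholders):

from pathlib import Path

from liquidai import Client

client = Client()

folder = Path("docs")                          # placeholder: directory containing your documents
for path in sorted(folder.glob("*.txt")):
    response = client.upload_file(str(path))
    print(f"Uploaded {path.name} -> {response['filename']}")

print("Files on server:", client.list_files())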