PromptQL Natural Language API SDK for Python
A Python SDK for interacting with the PromptQL Natural Language API.
Features
- Full support for the PromptQL Natural Language API
- Type-safe interface with Pydantic models
- Support for streaming responses
- Conversation management
- Support for multiple LLM providers (Hasura, Anthropic, OpenAI)
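Installation
The distribution filenames below suggest the package is published on PyPI as promptql-api-sdk (an inferred name; confirm against the package index before relying on it):

```shell
pip install promptql-api-sdk
```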
Quick Start
```python
from promptql_api_sdk import PromptQLClient
from promptql_api_sdk.types.models import HasuraLLMProvider

# Initialize the client
client = PromptQLClient(
    api_key="your-promptql-api-key",
    ddn_url="your-ddn-url",
    llm_provider=HasuraLLMProvider(),
    timezone="America/Los_Angeles",
)

# Send a simple query
response = client.query("What is the average temperature in San Francisco?")
print(response.assistant_actions[0].message)

# Use streaming for real-time responses
for chunk in client.query("Tell me about the weather in New York", stream=True):
    if hasattr(chunk, "message") and chunk.message:
        print(chunk.message, end="", flush=True)
```
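The streaming loop can also assemble the chunks into one complete message. A minimal sketch of that pattern, using a stand-in chunk type and generator in place of `client.query(..., stream=True)` (both are illustrative, not part of the SDK):

```python
# Illustrative stand-in for the SDK's streamed chunk objects.
class Chunk:
    def __init__(self, message):
        self.message = message

def fake_stream():
    """Stand-in generator for client.query("...", stream=True)."""
    for part in ["The weather ", "in New York ", "is mild."]:
        yield Chunk(part)

# Accumulate streamed chunks into the full assistant message.
parts = []
for chunk in fake_stream():
    if hasattr(chunk, "message") and chunk.message:
        parts.append(chunk.message)

full_message = "".join(parts)
print(full_message)  # The weather in New York is mild.
```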
Conversation Management
The SDK provides a Conversation class to help manage multi-turn conversations:
```python
# Create a conversation
conversation = client.create_conversation(
    system_instructions="You are a helpful assistant that provides weather information."
)

# Send messages in the conversation
response = conversation.send_message("What's the weather like in London?")
print(response.message)

# Send a follow-up message
response = conversation.send_message("How about tomorrow?")
print(response.message)

# Get all artifacts created during the conversation
artifacts = conversation.get_artifacts()
```
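The multi-turn flow amounts to: send a message, read the reply, repeat, with the conversation tracking history between turns. A runnable sketch of that shape using a stub class (purely illustrative; the real `Conversation` comes from `client.create_conversation` and calls the API):

```python
class StubConversation:
    """Illustrative stand-in for the SDK's Conversation: records turn history."""

    def __init__(self):
        self.history = []

    def send_message(self, text):
        self.history.append(("user", text))
        reply = f"echo: {text}"  # a real conversation would call the API here
        self.history.append(("assistant", reply))
        return reply

conv = StubConversation()
for question in ["What's the weather like in London?", "How about tomorrow?"]:
    print(conv.send_message(question))

# Two user turns and two assistant turns are now recorded.
print(len(conv.history))  # 4
```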
LLM Provider Configuration
The SDK supports multiple LLM providers:
```python
from promptql_api_sdk.types.models import (
    HasuraLLMProvider,
    AnthropicLLMProvider,
    OpenAILLMProvider,
)

# Hasura (default)
hasura_provider = HasuraLLMProvider()

# Anthropic
anthropic_provider = AnthropicLLMProvider(api_key="your-anthropic-api-key")

# OpenAI
openai_provider = OpenAILLMProvider(api_key="your-openai-api-key")

# Use with the client
client = PromptQLClient(
    api_key="your-promptql-api-key",
    ddn_url="your-ddn-url",
    llm_provider=anthropic_provider,
)
```
Error Handling
```python
from promptql_api_sdk import PromptQLClient
from promptql_api_sdk.exceptions import PromptQLAPIError

client = PromptQLClient(...)

try:
    response = client.query("What is the weather like?")
except PromptQLAPIError as e:
    print(f"API Error: {e}")
```
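A common pattern on top of this is a bounded retry for transient failures. A self-contained sketch using a placeholder exception and a flaky stub function so it runs standalone (substitute `PromptQLAPIError` and `client.query` in real code):

```python
import time

class APIError(Exception):
    """Stand-in for PromptQLAPIError so this sketch runs without the SDK."""

def query_with_retry(query_fn, prompt, retries=3, delay=0.01):
    """Retry a flaky query a bounded number of times before giving up."""
    for attempt in range(retries):
        try:
            return query_fn(prompt)
        except APIError:
            if attempt == retries - 1:
                raise  # out of attempts: surface the error to the caller
            time.sleep(delay)  # back off briefly before retrying

# Flaky stub: fails twice, then succeeds.
calls = {"n": 0}
def flaky_query(prompt):
    calls["n"] += 1
    if calls["n"] < 3:
        raise APIError("transient failure")
    return f"answer to: {prompt}"

result = query_with_retry(flaky_query, "What is the weather like?")
print(result)  # answer to: What is the weather like?
```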
License
MIT
Download files
- Source Distribution: promptql_api_sdk-0.1.0.tar.gz
- Built Distribution: promptql_api_sdk-0.1.0-py3-none-any.whl
File details
Details for the file promptql_api_sdk-0.1.0.tar.gz.
File metadata
- Download URL: promptql_api_sdk-0.1.0.tar.gz
- Upload date:
- Size: 6.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.0.1 CPython/3.13.2 Linux/6.14.2-arch1-1
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 49091fe24733ab9b686add6abd1f194b86aba844dec30162261fb5c4aea6ffdc |
| MD5 | daf65f0ab0bc3bf8b8e4a7659728430f |
| BLAKE2b-256 | bda5ebea6371f3655138cfea85bc1be0e40365a64e77f49852de569b4aaab980 |
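These hashes can be checked locally after downloading the archive. A minimal sketch using Python's standard `hashlib`, demonstrated on a byte string rather than the actual file (the commented line shows where the real file contents would go):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the hex SHA256 digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()

# With a real download you would hash the file contents and compare against
# the digest published above, e.g.:
# sha256_of(open("promptql_api_sdk-0.1.0.tar.gz", "rb").read())
digest = sha256_of(b"hello")
print(digest)  # 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824
```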
File details
Details for the file promptql_api_sdk-0.1.0-py3-none-any.whl.
File metadata
- Download URL: promptql_api_sdk-0.1.0-py3-none-any.whl
- Upload date:
- Size: 9.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.0.1 CPython/3.13.2 Linux/6.14.2-arch1-1
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 10df0507fb250bc4ca02fee600a22d22cf7c62ca930e681ab01f82c3401db614 |
| MD5 | 81c63d8dff9c565a4431290603f2fb33 |
| BLAKE2b-256 | 8795dae54d7b621f3aa1ad8c80463ee5f648dee51d49eb9ab8dac53b24571817 |