# Auto-generated JBAI API Python client
JetBrains AI API client
## Installation

```shell
pip install jbai-client
```
## Usage examples

Create the client:

```python
import os

from jbai_client import JbaiPlatformClient, JbaiAuthType, JbaiEndpoint

client = JbaiPlatformClient(
    endpoint=JbaiEndpoint.STAGING,
    auth_type=JbaiAuthType.USER,
    api_key=os.getenv("JBAI_TOKEN"),
)
```
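Note that if `JBAI_TOKEN` is unset, `os.getenv` returns `None` and the client is created with no usable key, which only surfaces on the first request. A small guard (a hypothetical helper, not part of the package) fails fast instead:

```python
import os

def require_token(var: str = "JBAI_TOKEN") -> str:
    """Return the API token from the environment, or fail with a clear error."""
    token = os.getenv(var)
    if not token:
        raise RuntimeError(f"Set the {var} environment variable before creating the client")
    return token
```

Pass `api_key=require_token()` to the client constructor to get an immediate, descriptive error on misconfiguration.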
### LLM API

Get the list of LLM profiles:

```python
response = client.get_llm_profiles_v9()
for profile in response.profiles:
    print(profile)
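The returned profiles can be filtered client-side to pick a model family. A sketch over plain id strings (the real entries are profile objects, so adapt the attribute access accordingly):

```python
def pick_profiles(profile_ids, prefix="openai"):
    """Return the profile ids that start with the given provider prefix."""
    return [pid for pid in profile_ids if pid.startswith(prefix)]

# Dummy ids for illustration; in practice iterate over response.profiles.
ids = ["openai-gpt-4", "openai-gpt-4o", "anthropic-claude-3"]
print(pick_profiles(ids))  # ['openai-gpt-4', 'openai-gpt-4o']
```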
Use the LLM chat:

```python
from jbai_client.models import ChatModelsStreamV9Request, LLMChatUserMessage, V5LLMChat

# noinspection PyTypeChecker
for response in client.post_llm_chat_stream_v9(
    request=ChatModelsStreamV9Request(
        prompt="test",
        profile="openai-gpt-4",
        chat=V5LLMChat(
            messages=[
                LLMChatUserMessage(content="Tell me a joke about programmers"),
            ]
        ),
    )
):
    print(response)
```
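The loop above prints each streamed chunk as it arrives; to work with the complete reply, collect the chunks first. A minimal sketch, with a stub generator standing in for `post_llm_chat_stream_v9` (the real chunk objects may wrap the text, so adapt the extraction to their actual shape):

```python
def collect_stream(chunks):
    """Join streamed text chunks into the complete reply."""
    return "".join(chunks)

def fake_stream():
    # Stand-in for client.post_llm_chat_stream_v9(...): yields text pieces.
    yield "Why do programmers prefer dark mode? "
    yield "Because light attracts bugs."

print(collect_stream(fake_stream()))
```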
Use `AsyncJbaiPlatformClient` for asynchronous execution:

```python
import asyncio
import os

from jbai_client import AsyncJbaiPlatformClient, JbaiAuthType, JbaiEndpoint
from jbai_client.models import ChatModelsStreamV9Request, LLMChatUserMessage, V5LLMChat

client = AsyncJbaiPlatformClient(
    endpoint=JbaiEndpoint.STAGING,
    auth_type=JbaiAuthType.USER,
    api_key=os.getenv("JBAI_TOKEN"),
)

async def main():
    # noinspection PyTypeChecker
    async for response in client.post_llm_chat_stream_v9(
        request=ChatModelsStreamV9Request(
            prompt="test",
            profile="openai-gpt-4",
            chat=V5LLMChat(
                messages=[
                    LLMChatUserMessage(content="Tell me a joke about programmers"),
                ]
            ),
        )
    ):
        print(response)

asyncio.run(main())
```
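One benefit of the async client is issuing several chats concurrently with `asyncio.gather`. A sketch with stub coroutines in place of real client calls (`ask` is hypothetical; in practice it would drain `client.post_llm_chat_stream_v9(...)` into a string):

```python
import asyncio

async def ask(prompt: str) -> str:
    # Stand-in for an async client call.
    await asyncio.sleep(0)
    return f"answer to: {prompt}"

async def main():
    # Both requests run concurrently; gather preserves argument order.
    answers = await asyncio.gather(ask("first"), ask("second"))
    for answer in answers:
        print(answer)

asyncio.run(main())
```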
### Tasks API

Get the tasks roster:

```python
response = client.get_task_roster()
for task in response.ids:
    print(task)
```
Call a task; optionally, specify a task tag via a custom header (valid only on STAGING):

```python
from jbai_client import JbaiHeader
from jbai_client.models import TaskAPIStreamV5TextImproveShortenRequest

# noinspection PyTypeChecker
for response in client.text_improve_shorten_v5(
    TaskAPIStreamV5TextImproveShortenRequest(
        parameters={
            "text": "This is a simple example sentence that could be made shorter and clearer.",
            "lang": "en",
        }
    ),
    headers={
        JbaiHeader.GRAZIE_TASK_TAG: "openai-chat-gpt"
    },
):
    print(response)
```
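Streamed task calls can fail transiently (network hiccups, rate limits). A hypothetical retry wrapper, not part of the package, that re-invokes a callable a few times with a simple linear backoff:

```python
import time

def with_retries(fn, attempts=3, delay=0.1):
    """Call fn(), retrying on any exception up to `attempts` times."""
    last_error = None
    for attempt in range(attempts):
        try:
            return fn()
        except Exception as error:  # narrow to the client's error types in practice
            last_error = error
            time.sleep(delay * (attempt + 1))  # linear backoff between attempts
    raise last_error

# Usage sketch: with_retries(lambda: list(client.text_improve_shorten_v5(...)))
```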
P.S. Most modern IDEs support auto-import, which simplifies discovering and importing the data model classes.
## Data models documentation

See `MODELS.md`.
### File details: jbai_client-2026.1.809.tar.gz

- Download URL: jbai_client-2026.1.809.tar.gz
- Upload date:
- Size: 69.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.4 CPython/3.10.13 Darwin/24.6.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `928e6eaad5d036607b02e6447980df90572f7baceec826c7ffb48aad808906b7` |
| MD5 | `d58f5721bb2c6d14d198cbf2de29a40c` |
| BLAKE2b-256 | `1dd317e6bbcaf17878cd1a44ca8e69ea39d81a69850ae10d918bf6997264f926` |
### File details: jbai_client-2026.1.809-py3-none-any.whl

- Download URL: jbai_client-2026.1.809-py3-none-any.whl
- Upload date:
- Size: 457.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/2.1.4 CPython/3.10.13 Darwin/24.6.0

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `f80a71e6dffedfa4f3c27c9254ab84b925bd344879ebdb912962454b87bdb6c1` |
| MD5 | `fe3d8f9c7a54cb16b0269cb8c16a5dba` |
| BLAKE2b-256 | `de506ecb93ac8ec239d5e1ca5b92ee594a4cc9b4719a8355b5916e7bf68991e5` |