# llm-taxi

Call an LLM as easily as calling a taxi.
## Installation

```bash
pip install llm-taxi
```
## Usage

### Use as a library
```python
import asyncio

from llm_taxi.conversation import Message, Role
from llm_taxi.factory import embedding, llm


async def main():
    # Non-streaming response
    client = llm(model="openai:gpt-3.5-turbo")
    messages = [
        Message(role=Role.User, content="What is the capital of France?"),
    ]
    response = await client.response(messages)
    print(response)

    # Streaming response
    client = llm(model="mistral:mistral-small")
    messages = [
        Message(role=Role.User, content="Tell me a joke."),
    ]
    response = await client.streaming_response(messages)
    async for chunk in response:
        print(chunk, end="", flush=True)
    print()

    # Embed a single text
    embedder = embedding("openai:text-embedding-ada-002")
    embeddings = await embedder.embed_text("Hello, world!")
    print(embeddings[:10])

    # Embed a batch of texts
    embedder = embedding("mistral:mistral-embed")
    embeddings = await embedder.embed_texts(["Hello, world!"])
    print(embeddings[0][:10])


if __name__ == "__main__":
    asyncio.run(main())
```
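The example above assumes the relevant provider credentials are already configured. A minimal sketch, assuming the clients pick up API keys from conventionally named environment variables (the variable names below are assumptions, not taken from this README):

```python
import os

# Assumed variable names for illustration; set only the providers you use,
# and prefer exporting these in your shell rather than hard-coding them.
os.environ.setdefault("OPENAI_API_KEY", "sk-...")
os.environ.setdefault("MISTRAL_API_KEY", "your-mistral-key")
```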
### Common parameters

- `temperature`
- `max_tokens`
- `top_k`
- `top_p`
- `stop`
- `seed`
- `presence_penalty`
- `frequency_penalty`
- `response_format`
- `tools`
- `tool_choice`
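These are passed as ordinary keyword arguments, and which ones take effect depends on the provider. A minimal sketch, assuming they are forwarded on the response call (whether they also belong on the `llm()` factory is an assumption this README does not settle):

```python
import asyncio

from llm_taxi.conversation import Message, Role
from llm_taxi.factory import llm


async def main():
    client = llm(model="openai:gpt-3.5-turbo")
    messages = [Message(role=Role.User, content="Summarize llm-taxi in one sentence.")]
    # Assumption: common parameters are accepted as keyword arguments here.
    response = await client.response(messages, temperature=0.2, max_tokens=128)
    print(response)


if __name__ == "__main__":
    asyncio.run(main())
```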
### Command line interface

```bash
llm-taxi --model openai:gpt-3.5-turbo-16k
```

See all supported arguments:

```bash
llm-taxi --help
```
## Supported Providers

| Provider   | LLM | Embedding |
|------------|-----|-----------|
| Anthropic  | ✅  |           |
| DashScope  | ✅  |           |
| DeepInfra  | ✅  |           |
| DeepSeek   | ✅  |           |
| Google     | ✅  | ✅        |
| Groq       | ✅  |           |
| Mistral    | ✅  | ✅        |
| OpenAI     | ✅  | ✅        |
| OpenRouter | ✅  |           |
| Perplexity | ✅  |           |
| Together   | ✅  |           |
| BigModel   | ✅  |           |
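Switching providers is just a matter of changing the `provider:model` string passed to `llm()` or `embedding()`, provided that provider's credentials are configured. A short sketch using Anthropic from the table above (the model identifier is illustrative, not taken from this README):

```python
import asyncio

from llm_taxi.conversation import Message, Role
from llm_taxi.factory import llm


async def main():
    # "anthropic:" selects the provider; the model name is an illustrative example.
    client = llm(model="anthropic:claude-3-haiku-20240307")
    messages = [Message(role=Role.User, content="Name three uses for a taxi.")]
    print(await client.response(messages))


if __name__ == "__main__":
    asyncio.run(main())
```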