# langchain-kyma

An integration package connecting the Kyma API and LangChain.

Kyma API is a free LLM gateway providing access to 21+ open-source models (DeepSeek, Qwen, Llama, Gemma, Mistral, and more) via a single OpenAI-compatible endpoint.
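Because the gateway is OpenAI-compatible, a raw chat-completions request is an ordinary HTTP POST. The sketch below only builds the request URL, headers, and payload without sending anything; the `/chat/completions` path and field names follow the OpenAI convention, which is an assumption about how Kyma's gateway is laid out:

```python
import os

# Assumed base URL of the OpenAI-compatible gateway.
KYMA_BASE_URL = "https://kymaapi.com/v1"

def build_chat_request(model: str, user_text: str) -> tuple[str, dict, dict]:
    """Build (url, headers, payload) for an OpenAI-style chat completion."""
    url = f"{KYMA_BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {os.environ.get('KYMA_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
    }
    return url, headers, payload

url, headers, payload = build_chat_request("qwen-3.6-plus", "Hello!")
print(url)  # https://kymaapi.com/v1/chat/completions
```

Any OpenAI-compatible client can then be pointed at the same base URL; the `ChatKyma` class below wraps this for LangChain.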
## Installation

```bash
pip install -U langchain-kyma
```
## Setup

Get a free API key at kymaapi.com, then set it as an environment variable:

```bash
export KYMA_API_KEY="ky-your-api-key-here"
```
## Usage

```python
from langchain_kyma import ChatKyma

llm = ChatKyma(model="qwen-3.6-plus")

messages = [
    ("system", "You are a helpful assistant."),
    ("human", "What is the capital of France?"),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)
```
## Model aliases

Use convenient aliases to select the best model for your use case:

```python
ChatKyma(model="best")       # highest quality (qwen-3.6-plus)
ChatKyma(model="fast")       # lowest latency (llama-3.3-70b)
ChatKyma(model="code")       # coding tasks (kimi-k2.5)
ChatKyma(model="reasoning")  # complex reasoning (deepseek-r1)
ChatKyma(model="agent")      # agentic tasks (kimi-k2.5)
ChatKyma(model="vision")     # multimodal (gemma-4-31b)
```
## Streaming

```python
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)
```
## Tool calling

```python
from pydantic import BaseModel, Field

class GetWeather(BaseModel):
    """Get the current weather in a given location."""

    location: str = Field(..., description="City and state, e.g. San Francisco, CA")

llm_with_tools = llm.bind_tools([GetWeather])
ai_msg = llm_with_tools.invoke("What's the weather in Paris?")
print(ai_msg.tool_calls)
```
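`ai_msg.tool_calls` follows LangChain's standard shape (a list of dicts with `name`, `args`, and `id` keys), so executing the requested calls is a simple dispatch. A sketch with a stand-in weather function; `run_weather` and the registry are illustrative, not part of the package:

```python
# Each entry in ai_msg.tool_calls looks like:
#   {"name": "GetWeather", "args": {"location": "Paris, FR"}, "id": "call_123"}

def run_weather(location: str) -> str:
    """Stand-in implementation; a real app would call a weather API here."""
    return f"Sunny in {location}"

TOOL_REGISTRY = {"GetWeather": lambda args: run_weather(**args)}

def execute_tool_calls(tool_calls: list[dict]) -> list[dict]:
    """Run each requested tool and pair its output with the originating call id."""
    results = []
    for call in tool_calls:
        output = TOOL_REGISTRY[call["name"]](call["args"])
        results.append({"tool_call_id": call["id"], "content": output})
    return results

sample_calls = [{"name": "GetWeather", "args": {"location": "Paris, FR"}, "id": "call_123"}]
print(execute_tool_calls(sample_calls))
```

In a full loop, each result would typically be wrapped in a LangChain `ToolMessage` and appended to the conversation before invoking the model again.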
## Structured output

```python
from typing import Optional

from pydantic import BaseModel, Field

class Joke(BaseModel):
    """Joke to tell user."""

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")
    rating: Optional[int] = Field(default=None, description="Funniness from 1 to 10")

structured_llm = llm.with_structured_output(Joke)
joke = structured_llm.invoke("Tell me a joke about AI")
print(joke)
```
## Async

```python
import asyncio

async def main():
    ai_msg = await llm.ainvoke(messages)
    print(ai_msg.content)

asyncio.run(main())
```
## Available models

Browse all available models at kymaapi.com/dashboard/models or via the API:

```python
import os

import httpx

resp = httpx.get(
    "https://kymaapi.com/v1/models",
    headers={"Authorization": f"Bearer {os.environ['KYMA_API_KEY']}"},
)
for model in resp.json()["data"]:
    print(model["id"])
```
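The response follows the OpenAI-style `{"data": [{"id": ...}]}` layout, so narrowing the catalog to one model family is plain list work. A sketch against a mocked response, with no network call (the ids shown are taken from the alias table above):

```python
# Mocked /v1/models response in the OpenAI-style layout.
models_json = {
    "data": [
        {"id": "qwen-3.6-plus"},
        {"id": "llama-3.3-70b"},
        {"id": "deepseek-r1"},
    ]
}

def model_ids(payload: dict, prefix: str = "") -> list[str]:
    """Return model ids from a models response, optionally filtered by prefix."""
    return [m["id"] for m in payload["data"] if m["id"].startswith(prefix)]

print(model_ids(models_json, prefix="qwen"))  # ['qwen-3.6-plus']
```

The same helper works on `resp.json()` from the live request above.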
## File details

### langchain_kyma-0.1.0.tar.gz (source distribution)

- Size: 4.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | 005d03e91a7fc0b86480546670f20cd0a034a5511d24e666a1a2f64eadedee7c |
| MD5 | 5d56cc718c7acf8a6338608d8f36934c |
| BLAKE2b-256 | b132a8f2a1c4ada99b7cecac33cb97bd7ef5a6771c85f30b8ac56faa55bf7d83 |

### langchain_kyma-0.1.0-py3-none-any.whl (built distribution)

- Size: 5.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.0

| Algorithm | Hash digest |
|---|---|
| SHA256 | 6c080a8b4515c78ca192969a74b9f308e6d0126eaa0322a2d13404908bf984d9 |
| MD5 | 9a427f29675f2a1016241367962cf587 |
| BLAKE2b-256 | a7c7ec66b7cb1fe22318e9b4625fc9b971f6f4dbd3d2c2565cfa95176f063572 |