# LangChain LLM7 Integration

A LangChain wrapper for the LLM7 API: the official LangChain compatibility layer for LLM7 API services.
## Installation

```bash
pip install langchain-llm7
```

## Features
- 🚀 Native integration with LangChain's BaseChatModel interface
- ⚡ Support for both streaming and non-streaming responses
- 🔧 Customizable model parameters (temperature, max_tokens, stop sequences)
- 📊 Token usage metadata tracking
- 🛠 Robust error handling and retry mechanisms
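The retry mechanism mentioned above is not specified in detail here; as a rough illustration (not the library's actual internals), retry-with-exponential-backoff logic for transient API failures typically looks like this:

```python
import time

def with_retries(fn, max_attempts=3, base_delay=0.1, retry_on=(ConnectionError,)):
    """Call fn(), retrying with exponential backoff on transient errors."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except retry_on:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))

# Hypothetical usage: wrap a flaky call that succeeds on the third attempt
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return "ok"

print(with_retries(flaky, base_delay=0.0))
```

The `with_retries` helper and its parameters are illustrative only; consult the library source for its actual error-handling behavior.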
## Usage

### Basic Implementation
```python
from langchain_llm7 import ChatLLM7
from langchain_core.messages import HumanMessage

# Initialize with default parameters
llm = ChatLLM7()

# Basic invocation
response = llm.invoke([HumanMessage(content="Hello!")])
print(response.content)
```
### Streaming Responses

```python
# Enable streaming
llm = ChatLLM7(streaming=True)

for chunk in llm.stream([HumanMessage(content="Tell me about quantum computing")]):
    print(chunk.content, end="", flush=True)
```
### Advanced Configuration

```python
# Custom model configuration
llm = ChatLLM7(
    model="llama-3.3-70b-instruct-fp8-fast",
    temperature=0.7,
    max_tokens=500,
    stop=["\n", "Observation:"],
    timeout=45,
)
```
## Parameters

| Parameter | Description | Default |
|---|---|---|
| `model` | Model version to use | `"gpt-4o-mini-2024-07-18"` |
| `base_url` | API endpoint URL | `"https://api.llm7.io/v1"` |
| `temperature` | Sampling temperature (0.0 to 2.0) | `1.0` |
| `max_tokens` | Maximum number of tokens to generate | `None` |
| `timeout` | Request timeout in seconds | `120` |
| `stop` | Stop sequences for response generation | `None` |
| `streaming` | Enable streaming response mode | `False` |
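Since the default `base_url` points at an OpenAI-style `/v1` endpoint, these parameters presumably map onto a chat-completions request body roughly as sketched below. This is an assumption for illustration; the library's actual wire format may differ.

```python
def build_payload(messages, model="gpt-4o-mini-2024-07-18",
                  temperature=1.0, max_tokens=None, stop=None, stream=False):
    """Assemble a hypothetical OpenAI-style chat-completions payload."""
    payload = {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "stream": stream,
    }
    # Optional fields are omitted when unset, matching the None defaults above
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    if stop is not None:
        payload["stop"] = stop
    return payload

payload = build_payload(
    [{"role": "user", "content": "Hello!"}],
    temperature=0.7, max_tokens=500, stop=["\n"],
)
```

`build_payload` is a hypothetical helper named here for illustration; it is not part of the `langchain_llm7` package.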
## Error Handling
The library provides detailed error messages for:
- API communication failures
- Invalid message formats
- Unsupported message types
- Response parsing errors
```python
try:
    llm.invoke([{"invalid": "message"}])
except ValueError as e:
    print(f"Error: {e}")
```
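The invalid dict in the example above triggers a `ValueError`. A simplified, self-contained sketch of the kind of message validation involved (hypothetical; not the library's actual checks or role set):

```python
ALLOWED_ROLES = {"system", "user", "assistant"}  # assumed role set, for illustration

def validate_message(msg):
    """Raise ValueError for values that don't look like chat messages."""
    if not isinstance(msg, dict):
        raise ValueError(f"Unsupported message type: {type(msg).__name__}")
    if "role" not in msg or "content" not in msg:
        raise ValueError(f"Invalid message format: {msg!r}")
    if msg["role"] not in ALLOWED_ROLES:
        raise ValueError(f"Unknown role: {msg['role']!r}")
    return msg

try:
    validate_message({"invalid": "message"})
except ValueError as e:
    print(f"Error: {e}")
```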
## Testing

To run the test suite:

```bash
pip install pytest
pytest tests/
```
## Documentation
For complete documentation see:
## Contributing

Contributions are welcome! Please open an issue or submit a pull request.
## License
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.
## Contact
Eugene Evstafev
## File details

Details for the file `langchain_llm7-2025.9.111220.tar.gz`.

### File metadata

- Download URL: langchain_llm7-2025.9.111220.tar.gz
- Upload date:
- Size: 10.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.11
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `3d6e7488edd593beee173c8346212cd2665f32d2cfe2517317e83665dc5a1f99` |
| MD5 | `9237167b1fb6d4053e8f43e72c360e46` |
| BLAKE2b-256 | `53b9fb88cd9da4bd6e4adbae3d5367f1af4bb4ef408c69425f3b14a6c387ce84` |
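The published digests can be verified locally after downloading; for example, using Python's standard `hashlib` (the file name and expected SHA256 come from the listing above):

```python
import hashlib

def sha256_of_file(path, chunk_size=8192):
    """Compute the SHA-256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "3d6e7488edd593beee173c8346212cd2665f32d2cfe2517317e83665dc5a1f99"
# Uncomment after downloading the sdist:
# assert sha256_of_file("langchain_llm7-2025.9.111220.tar.gz") == expected
```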
## File details

Details for the file `langchain_llm7-2025.9.111220-py3-none-any.whl`.

### File metadata

- Download URL: langchain_llm7-2025.9.111220-py3-none-any.whl
- Upload date:
- Size: 12.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.11
### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `d00e74bf37b93390423e31c260e5fc9412772740cdfac16f35decc00d5d34b7b` |
| MD5 | `25c7827b6ca76a9460469c1601874c4c` |
| BLAKE2b-256 | `d5f322a0aaf8e617c8efcb995d6af03a539cc62161119b2a40adbb8b44bbee68` |