# OrcaKit SDK

A public SDK for AI Agent development based on LangGraph, providing common utilities, MCP adapters, and support for OpenAI-compatible LLM services.
## Features

- 🔧 **MCP Adapter**: Integrated Model Context Protocol (MCP) client with multi-server configuration support
- 🤖 **Compatible Models**: Support for OpenAI-compatible LLM services (e.g., DeepSeek)
- 🛠️ **Utility Functions**: Message handling, model loading, and other common utilities
- 📦 **Lightweight**: Minimal core dependencies for easy integration
## Installation

```bash
pip install orcakit-sdk
```

Or using uv:

```bash
uv add orcakit-sdk
```
## Quick Start

### Using OpenAI-Compatible Models

```python
from orcakit_sdk import create_compatible_openai_client

# Create a client for a specific model (requires the OPENAI_API_KEY
# and OPENAI_BASE_URL environment variables)
client = create_compatible_openai_client(model_name="gpt-4")

# Or use the default model (read from the OPENAI_MODEL_NAME environment variable)
client = create_compatible_openai_client()
```
### Using MCP Adapter

```python
import asyncio

from orcakit_sdk import get_mcp_client, get_mcp_tools


async def main():
    # MCP server configuration (JSON format)
    server_configs = '''
    {
        "mcpServers": {
            "filesystem": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/files"]
            }
        }
    }
    '''

    # Get the MCP client
    client = await get_mcp_client(server_configs)

    # Get the available tools
    tools = await get_mcp_tools(server_configs)
    print(f"Loaded {len(tools)} tools")


asyncio.run(main())
```
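Because the server configuration travels as a raw JSON string, a malformed config only surfaces once the client tries to initialize. A small pre-flight check can fail faster; this is a sketch, and `validate_server_configs` is a hypothetical helper, not part of the SDK:

```python
import json


def validate_server_configs(raw: str) -> dict:
    # Hypothetical pre-flight check: the SDK expects a JSON string with a
    # top-level "mcpServers" object mapping server names to their configs.
    config = json.loads(raw)
    servers = config.get("mcpServers")
    if not isinstance(servers, dict):
        raise ValueError('config must contain a top-level "mcpServers" object')
    return servers


servers = validate_server_configs('{"mcpServers": {"filesystem": {"command": "npx"}}}')
print(sorted(servers))  # prints the configured server names
```

Validating up front keeps JSON errors out of the async initialization path, where they are harder to attribute.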
### Loading Chat Models

```python
from orcakit_sdk import load_chat_model

# Load a model from a fully qualified name
model = load_chat_model("openai/gpt-4")

# Or use an OpenAI-compatible model
model = load_chat_model("compatible_openai/deepseek-chat")
```
### Processing Message Content

```python
from langchain_core.messages import HumanMessage

from orcakit_sdk import get_message_text

msg = HumanMessage(content="Hello, world!")
text = get_message_text(msg)
print(text)  # Output: Hello, world!
```
## Environment Variables

The SDK supports the following environment variables:

- `OPENAI_API_KEY`: OpenAI API key
- `OPENAI_BASE_URL`: OpenAI API base URL (for compatible services)
- `OPENAI_MODEL_NAME`: Default model name
Example `.env` file:

```env
OPENAI_API_KEY=sk-xxx
OPENAI_BASE_URL=https://api.deepseek.com/v1
OPENAI_MODEL_NAME=deepseek-chat
```
## API Reference

### MCP Adapter

#### `get_mcp_client(server_configs: str) -> MultiServerMCPClient | None`

Get or initialize the global MCP client.

**Parameters:**

- `server_configs`: JSON-formatted server configuration string

**Returns:**

- `MultiServerMCPClient` instance, or `None` if initialization fails
#### `get_mcp_tools(server_configs: str) -> list[Callable[..., object]]`

Get the list of tools provided by the MCP server.

**Parameters:**

- `server_configs`: JSON-formatted server configuration string

**Returns:**

- List of tool functions

#### `clear_mcp_cache() -> None`

Clear the MCP client and tools cache (mainly for testing).
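Since `get_mcp_client` returns a *global* client, repeated calls with the same configuration reuse one instance, and `clear_mcp_cache` exists to reset that state between tests. The pattern, in a simplified hypothetical form (not the SDK's actual code):

```python
# Hypothetical module-level cache illustrating the global-client pattern.
_cache: dict[str, object] = {}


def get_or_create(config: str, factory) -> object:
    # Build once per config string, then return the cached instance.
    if config not in _cache:
        _cache[config] = factory(config)
    return _cache[config]


def clear_cache() -> None:
    # Analogue of clear_mcp_cache(): drop all cached instances.
    _cache.clear()


calls = []
client_a = get_or_create("cfg", lambda c: calls.append(c) or object())
client_b = get_or_create("cfg", lambda c: calls.append(c) or object())
assert client_a is client_b and len(calls) == 1  # the factory ran only once
```

This is why test suites that vary server configurations should call `clear_mcp_cache()` between cases: otherwise a stale client built from an earlier config is silently reused.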
### Model Tools

#### `create_compatible_openai_client(model_name: str | None = None) -> ChatOpenAI`

Create an OpenAI-compatible chat model client.

**Parameters:**

- `model_name`: Model name (optional; defaults to the `OPENAI_MODEL_NAME` environment variable)

**Returns:**

- `ChatOpenAI` client instance
#### `load_chat_model(fully_specified_name: str) -> BaseChatModel`

Load a chat model from a fully qualified name.

**Parameters:**

- `fully_specified_name`: String in the format `provider/model`

**Returns:**

- `BaseChatModel` instance
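The `provider/model` convention makes routing simple: split on the first `/`, dispatch on the provider, and pass the remainder through as the model name. A hypothetical sketch of that parsing step (the SDK's actual dispatch logic is not shown here):

```python
def parse_fully_specified_name(name: str) -> tuple[str, str]:
    # "openai/gpt-4" -> ("openai", "gpt-4"). Splitting on the FIRST "/"
    # also tolerates model names that themselves contain slashes.
    provider, sep, model = name.partition("/")
    if not sep or not provider or not model:
        raise ValueError(f"expected 'provider/model', got {name!r}")
    return provider, model


print(parse_fully_specified_name("compatible_openai/deepseek-chat"))
```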
### Utility Functions

#### `get_message_text(msg: BaseMessage) -> str`

Extract text content from a message object.

**Parameters:**

- `msg`: LangChain message object

**Returns:**

- Extracted text content
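A LangChain message's `content` can be either a plain string or a list of content blocks (for multimodal input), so a text extractor has to handle both shapes. A hypothetical sketch of the idea behind `get_message_text` (not its actual implementation):

```python
def extract_text(content) -> str:
    # String content is returned as-is; list content keeps only the text
    # parts and ignores non-text blocks (e.g. images).
    if isinstance(content, str):
        return content
    parts = []
    for block in content:
        if isinstance(block, str):
            parts.append(block)
        elif isinstance(block, dict) and block.get("type") == "text":
            parts.append(block.get("text", ""))
    return "".join(parts)


print(extract_text([{"type": "text", "text": "Hello"}, {"type": "image_url", "image_url": {}}]))
```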
## Development

### Install Development Dependencies

```bash
make install-dev
```

### Run Tests

```bash
make test
```

### Code Formatting

```bash
make format
```

### Code Linting

```bash
make lint
```
## Project Structure

```
orcakit-sdk/
├── src/
│   └── orcakit_sdk/
│       ├── __init__.py        # Package entry point
│       ├── mcp_adapter.py     # MCP adapter
│       ├── model.py           # Model utilities
│       └── utils.py           # Utility functions
├── tests/
│   ├── unit_tests/
│   │   ├── test_utils.py
│   │   └── test_model.py
│   └── integration_tests/
├── pyproject.toml             # Project configuration
├── Makefile                   # Development commands
└── README.md                  # Project documentation
```
## Dependencies

Core dependencies:

- `langgraph >= 0.6.6`: LangGraph framework
- `langchain >= 0.2.14`: LangChain core library
- `langchain-openai >= 0.1.22`: OpenAI integration
- `langchain-mcp-adapters >= 0.1.9`: MCP adapter
- `python-dotenv >= 1.0.1`: Environment variable management
## License

MIT License

## Contributing

Issues and Pull Requests are welcome!

## Author

William Fu-Hinthorn (jubaoliang@gmail.com)