
OrcaKit SDK

A public SDK for building AI agents on LangGraph, providing common utilities, MCP adapters, and support for OpenAI-compatible LLM services.

Features

  • 🔧 MCP Adapter: Integrated Model Context Protocol (MCP) client with multi-server configuration support
  • 🤖 Compatible Models: Support for OpenAI-compatible LLM services (e.g., DeepSeek)
  • 🛠️ Utility Functions: Message handling, model loading, and other common utilities
  • 📦 Lightweight: Minimal core dependencies for easy integration

Installation

pip install orcakit-sdk

Or using uv:

uv add orcakit-sdk

Quick Start

Using OpenAI-Compatible Models

from orcakit_sdk import create_compatible_openai_client

# Create client (requires OPENAI_API_KEY and OPENAI_BASE_URL environment variables)
client = create_compatible_openai_client(model_name="gpt-4")

# Or use default model (read from OPENAI_MODEL_NAME environment variable)
client = create_compatible_openai_client()

Using MCP Adapter

import asyncio
from orcakit_sdk import get_mcp_client, get_mcp_tools

async def main():
    # MCP server configuration (JSON format)
    server_configs = '''
    {
        "mcpServers": {
            "filesystem": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/files"]
            }
        }
    }
    '''
    
    # Get MCP client (returns None if initialization fails)
    client = await get_mcp_client(server_configs)
    
    # Get available tools
    tools = await get_mcp_tools(server_configs)
    print(f"Loaded {len(tools)} tools")

asyncio.run(main())
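
Since `get_mcp_client` returns `None` on a bad configuration, it can help to sanity-check the JSON before handing it over. The helper below is hypothetical (not part of the SDK); it only assumes the `mcpServers` shape shown above, where each server entry carries either a `command` (stdio transport) or a `url`:

```python
import json

def validate_server_configs(server_configs: str) -> list[str]:
    """Hypothetical pre-flight check: parse the JSON config and return
    the configured server names, raising if the shape looks wrong."""
    config = json.loads(server_configs)
    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        raise ValueError("config must contain a non-empty 'mcpServers' object")
    for name, spec in servers.items():
        if "command" not in spec and "url" not in spec:
            raise ValueError(f"server {name!r} needs a 'command' or 'url'")
    return list(servers)

print(validate_server_configs(
    '{"mcpServers": {"filesystem": {"command": "npx", "args": ["-y"]}}}'
))  # ['filesystem']
```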

Loading Chat Models

from orcakit_sdk import load_chat_model

# Load model from fully qualified name
model = load_chat_model("openai/gpt-4")

# Or use OpenAI-compatible model
model = load_chat_model("compatible_openai/deepseek-chat")

Processing Message Content

from orcakit_sdk import get_message_text
from langchain_core.messages import HumanMessage

msg = HumanMessage(content="Hello, world!")
text = get_message_text(msg)
print(text)  # Output: Hello, world!

Environment Variables

The SDK supports the following environment variable configurations:

  • OPENAI_API_KEY: OpenAI API key
  • OPENAI_BASE_URL: OpenAI API base URL (for compatible services)
  • OPENAI_MODEL_NAME: Default model name

Example .env file:

OPENAI_API_KEY=sk-xxx
OPENAI_BASE_URL=https://api.deepseek.com/v1
OPENAI_MODEL_NAME=deepseek-chat
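
Because `python-dotenv` is a core dependency, calling its `load_dotenv()` before creating a client is the usual way to pull these values in. As a minimal illustration of what that does (a stdlib-only sketch, not the SDK's or python-dotenv's actual code), the parsing amounts to:

```python
import os

def load_env_text(text: str) -> dict[str, str]:
    """Sketch of what python-dotenv's load_dotenv() does: parse
    KEY=VALUE lines and export them into os.environ (without
    overwriting variables that are already set)."""
    loaded = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        loaded[key.strip()] = value.strip()
        os.environ.setdefault(key.strip(), value.strip())
    return loaded

env = load_env_text(
    "OPENAI_API_KEY=sk-xxx\n"
    "OPENAI_BASE_URL=https://api.deepseek.com/v1\n"
    "OPENAI_MODEL_NAME=deepseek-chat\n"
)
print(env["OPENAI_MODEL_NAME"])  # deepseek-chat
```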

API Reference

MCP Adapter

get_mcp_client(server_configs: str) -> MultiServerMCPClient | None

Get or initialize the global MCP client.

Parameters:

  • server_configs: JSON-formatted server configuration string

Returns:

  • MultiServerMCPClient instance or None (if initialization fails)

get_mcp_tools(server_configs: str) -> list[Callable[..., object]]

Get the list of tools provided by the MCP server.

Parameters:

  • server_configs: JSON-formatted server configuration string

Returns:

  • List of tool functions

clear_mcp_cache() -> None

Clear the MCP client and tools cache (mainly for testing).
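
The pairing of `get_mcp_client` ("get or initialize the global client") with `clear_mcp_cache` implies a config-keyed cache. A hypothetical sketch of that pattern, purely for illustration (the SDK's actual implementation may differ):

```python
import json
from typing import Any, Callable

_client_cache: dict[str, Any] = {}

def get_cached_client(server_configs: str, factory: Callable[[], Any]) -> Any:
    """Hypothetical sketch of config-keyed caching: normalize the JSON
    config string and reuse one client per distinct configuration."""
    key = json.dumps(json.loads(server_configs), sort_keys=True)
    if key not in _client_cache:
        _client_cache[key] = factory()  # only built on a cache miss
    return _client_cache[key]

def clear_cache() -> None:
    """Analogue of clear_mcp_cache(): drop all cached clients."""
    _client_cache.clear()

calls = []
cfg = '{"mcpServers": {"filesystem": {"command": "npx"}}}'
a = get_cached_client(cfg, lambda: calls.append(1) or object())
b = get_cached_client(cfg, lambda: calls.append(1) or object())
print(a is b, len(calls))  # True 1  -- second call hit the cache
```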

Model Tools

create_compatible_openai_client(model_name: str | None = None) -> ChatOpenAI

Create an OpenAI-compatible chat model client.

Parameters:

  • model_name: Model name (optional, defaults to environment variable)

Returns:

  • ChatOpenAI client instance

load_chat_model(fully_specified_name: str) -> BaseChatModel

Load a chat model from a fully qualified name.

Parameters:

  • fully_specified_name: String in the format provider/model

Returns:

  • BaseChatModel instance
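
The `provider/model` convention can be sketched with a small parser. This is an illustrative helper, not the SDK's code; the SDK presumably dispatches on the provider part (e.g. `compatible_openai`) after a split like this:

```python
def parse_model_name(fully_specified_name: str) -> tuple[str, str]:
    """Illustrative sketch: split a fully qualified name such as
    "openai/gpt-4" into its (provider, model) parts."""
    provider, sep, model = fully_specified_name.partition("/")
    if not sep or not model:
        raise ValueError(
            f"Expected 'provider/model', got {fully_specified_name!r}"
        )
    return provider, model

print(parse_model_name("compatible_openai/deepseek-chat"))
# ('compatible_openai', 'deepseek-chat')
```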

Utility Functions

get_message_text(msg: BaseMessage) -> str

Extract text content from a message object.

Parameters:

  • msg: LangChain message object

Returns:

  • Extracted text content
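
Extraction is needed because LangChain message `content` can be either a plain string or a list of content blocks (strings or dicts with a `"text"` field). A hedged sketch of the handling such a helper has to do (not necessarily the SDK's exact implementation):

```python
def extract_text(content) -> str:
    """Sketch of text extraction from LangChain message content:
    pass strings through, and join the text of list-form blocks."""
    if isinstance(content, str):
        return content
    parts = []
    for block in content:
        if isinstance(block, str):
            parts.append(block)
        elif isinstance(block, dict):
            parts.append(block.get("text", ""))  # ignore non-text blocks
    return "".join(parts).strip()

print(extract_text([{"type": "text", "text": "Hello"}, ", world!"]))
# Hello, world!
```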

Development

Install Development Dependencies

make install-dev

Run Tests

make test

Code Formatting

make format

Code Linting

make lint

Project Structure

orcakit-sdk/
├── src/
│   └── orcakit_sdk/
│       ├── __init__.py          # Package entry point
│       ├── mcp_adapter.py       # MCP adapter
│       ├── model.py             # Model utilities
│       └── utils.py             # Utility functions
├── tests/
│   ├── unit_tests/
│   │   ├── test_utils.py
│   │   └── test_model.py
│   └── integration_tests/
├── pyproject.toml               # Project configuration
├── Makefile                     # Development commands
└── README.md                    # Project documentation

Dependencies

Core dependencies:

  • langgraph >= 0.6.6: LangGraph framework
  • langchain >= 0.2.14: LangChain core library
  • langchain-openai >= 0.1.22: OpenAI integration
  • langchain-mcp-adapters >= 0.1.9: MCP adapter
  • python-dotenv >= 1.0.1: Environment variable management

License

MIT License

Contributing

Issues and Pull Requests are welcome!

Author

William Fu-Hinthorn (jubaoliang@gmail.com)
