
OrcaKit SDK

A public SDK for LangGraph-based AI Agent development, providing common utilities, MCP adapters, and support for OpenAI-compatible LLM services.

Features

  • 🔧 MCP Adapter: Integrated Model Context Protocol (MCP) client with multi-server configuration support
  • 🤖 Compatible Models: Support for OpenAI-compatible LLM services (e.g., DeepSeek)
  • 🛠️ Utility Functions: Message handling, model loading, and other common utilities
  • 📦 Lightweight: Minimal core dependencies for easy integration

Installation

pip install orcakit-sdk

Or using uv:

uv add orcakit-sdk

Quick Start

Using OpenAI-Compatible Models

from orcakit_sdk import create_compatible_openai_client

# Create client (requires OPENAI_API_KEY and OPENAI_BASE_URL environment variables)
client = create_compatible_openai_client(model_name="gpt-4")

# Or use default model (read from OPENAI_MODEL_NAME environment variable)
client = create_compatible_openai_client()

Using MCP Adapter

import asyncio
from orcakit_sdk import get_mcp_client, get_mcp_tools

async def main():
    # MCP server configuration (JSON format)
    server_configs = '''
    {
        "mcpServers": {
            "filesystem": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/files"]
            }
        }
    }
    '''
    
    # Get the (globally cached) MCP client
    client = await get_mcp_client(server_configs)
    
    # Get available tools
    tools = await get_mcp_tools(server_configs)
    print(f"Loaded {len(tools)} tools")

asyncio.run(main())
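The `mcpServers` JSON shown above can be sanity-checked before handing it to `get_mcp_client`. The sketch below uses only the standard library; the exact keys the SDK accepts are an assumption based on the example configuration.

```python
import json

def validate_server_configs(raw: str) -> dict:
    """Parse an MCP server-config string and check each server entry.

    The required 'command' key is an assumption based on the example
    configuration above, not a documented contract of orcakit_sdk.
    """
    config = json.loads(raw)  # raises json.JSONDecodeError on malformed input
    servers = config.get("mcpServers", {})
    for name, spec in servers.items():
        if "command" not in spec:
            raise ValueError(f"server {name!r} is missing a 'command' entry")
    return servers

raw = '{"mcpServers": {"filesystem": {"command": "npx", "args": ["-y"]}}}'
servers = validate_server_configs(raw)
print(sorted(servers))  # ['filesystem']
```

Validating early like this turns a vague downstream connection failure into an immediate, descriptive error.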

Loading Chat Models

from orcakit_sdk import load_chat_model

# Load model from fully qualified name
model = load_chat_model("openai/gpt-4")

# Or use OpenAI-compatible model
model = load_chat_model("compatible_openai/deepseek-chat")

Processing Message Content

from orcakit_sdk import get_message_text
from langchain_core.messages import HumanMessage

msg = HumanMessage(content="Hello, world!")
text = get_message_text(msg)
print(text)  # Output: Hello, world!

Environment Variables

The SDK reads the following environment variables:

  • OPENAI_API_KEY: OpenAI API key
  • OPENAI_BASE_URL: OpenAI API base URL (for compatible services)
  • OPENAI_MODEL_NAME: Default model name

Example .env file:

OPENAI_API_KEY=sk-xxx
OPENAI_BASE_URL=https://api.deepseek.com/v1
OPENAI_MODEL_NAME=deepseek-chat

API Reference

MCP Adapter

get_mcp_client(server_configs: str) -> MultiServerMCPClient | None

Get or initialize the global MCP client.

Parameters:

  • server_configs: JSON-formatted server configuration string

Returns:

  • MultiServerMCPClient instance or None (if initialization fails)

get_mcp_tools(server_configs: str) -> list[Callable[..., object]]

Get the list of tools provided by the MCP server.

Parameters:

  • server_configs: JSON-formatted server configuration string

Returns:

  • List of tool functions

clear_mcp_cache() -> None

Clear the MCP client and tools cache (mainly for testing).
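The existence of `clear_mcp_cache` suggests the client and tools are stored in a module-level cache. Below is a generic sketch of that pattern; the actual internals of `orcakit_sdk` are an assumption here.

```python
# A common module-level cache pattern of the kind clear_mcp_cache()
# implies; names and structure here are illustrative, not the SDK's.
_client_cache: dict[str, object] = {}

def get_cached(key: str, factory):
    """Return the cached value for key, creating it on first use."""
    if key not in _client_cache:
        _client_cache[key] = factory()
    return _client_cache[key]

def clear_cache() -> None:
    """Drop all cached entries, e.g. between tests."""
    _client_cache.clear()

calls = []
get_cached("a", lambda: calls.append(1) or "client")
get_cached("a", lambda: calls.append(1) or "client")
print(len(calls))  # 1: the factory ran only once
```

Clearing the cache between tests keeps one test's client configuration from leaking into the next.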

Model Tools

create_compatible_openai_client(model_name: str | None = None) -> ChatOpenAI

Create an OpenAI-compatible chat model client.

Parameters:

  • model_name: Model name (optional, defaults to environment variable)

Returns:

  • ChatOpenAI client instance

load_chat_model(fully_specified_name: str) -> BaseChatModel

Load a chat model from a fully qualified name.

Parameters:

  • fully_specified_name: String in the format provider/model

Returns:

  • BaseChatModel instance
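The `provider/model` format splits naturally on the first slash. This is a sketch of the parsing `load_chat_model` presumably performs; the error handling for names without a slash is an assumption.

```python
def split_model_name(fully_specified_name: str) -> tuple[str, str]:
    """Split 'provider/model' into its two parts.

    Rejecting names without a slash is assumed behavior, not documented.
    """
    provider, _, model = fully_specified_name.partition("/")
    if not model:
        raise ValueError(f"expected 'provider/model', got {fully_specified_name!r}")
    return provider, model

print(split_model_name("compatible_openai/deepseek-chat"))
# ('compatible_openai', 'deepseek-chat')
```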

Utility Functions

get_message_text(msg: BaseMessage) -> str

Extract text content from a message object.

Parameters:

  • msg: LangChain message object

Returns:

  • Extracted text content
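LangChain message `content` may be a plain string or a list of content blocks, which is why a helper like `get_message_text` is useful. Below is a dependency-free sketch of that kind of extraction; the exact block handling inside `orcakit_sdk` is an assumption.

```python
def extract_text(content) -> str:
    """Collect text from string or list-of-blocks message content.

    A LangChain-free sketch of what get_message_text likely does.
    """
    if isinstance(content, str):
        return content
    parts = []
    for block in content:
        if isinstance(block, str):
            parts.append(block)
        elif isinstance(block, dict):
            parts.append(block.get("text", ""))
    return "".join(parts)

print(extract_text([{"type": "text", "text": "Hello, "}, "world!"]))  # Hello, world!
```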

Development

Install Development Dependencies

make install-dev

Run Tests

make test

Code Formatting

make format

Code Linting

make lint

Project Structure

orcakit-sdk/
├── src/
│   └── orcakit_sdk/
│       ├── __init__.py          # Package entry point
│       ├── mcp_adapter.py       # MCP adapter
│       ├── model.py             # Model utilities
│       └── utils.py             # Utility functions
├── tests/
│   ├── unit_tests/
│   │   ├── test_utils.py
│   │   └── test_model.py
│   └── integration_tests/
├── pyproject.toml               # Project configuration
├── Makefile                     # Development commands
└── README.md                    # Project documentation

Dependencies

Core dependencies:

  • langgraph >= 0.6.6: LangGraph framework
  • langchain >= 0.2.14: LangChain core library
  • langchain-openai >= 0.1.22: OpenAI integration
  • langchain-mcp-adapters >= 0.1.9: MCP adapter
  • python-dotenv >= 1.0.1: Environment variable management

License

MIT License

Contributing

Issues and Pull Requests are welcome!

Author

William Fu-Hinthorn (jubaoliang@gmail.com)


